Pi-puck and Elisa-3 Aseba
=Introduction=
Aseba is a set of tools which allow novices to program robots easily and efficiently; refer to [https://www.thymio.org/en:start https://www.thymio.org/en:start] for more information.<br/>

=Hardware=
==Overview==
<span class="plainlinks">[https://projects.gctronic.com/epuck2/wiki_images/pipuck-overview.jpg <img width=600 src="https://projects.gctronic.com/epuck2/wiki_images/pipuck-overview-small.jpg">]</span><br/>
Features:
* Raspberry Pi Zero W or Zero 2 W connected to the robot via I2C
* interface between the robot base camera and the rPi via USB, up to 15 FPS
* 1 digital microphone and 1 speaker
* USB hub connected to the rPi with 2 free ports
* micro USB cable to the rPi UART port, also usable for charging
* 2 chargers: 1 for the robot battery and 1 for the auxiliary battery on top of the extension
* charging contact points in front for automatic charging; an external docking station is available
* several extension options: 6 I2C channels, 2 ADC inputs
* several LEDs showing the status of the rPi and of the power/chargers


==I2C bus==
I2C is used for communication between the various elements of the robot, the Pi-puck and the extensions. An overall schema is shown in the following figure:<br/>
<span class="plainlinks">[https://projects.gctronic.com/epuck2/wiki_images/i2c-buses.png <img width=600 src="https://projects.gctronic.com/epuck2/wiki_images/i2c-buses.png">]</span><br/>
An I2C switcher is included in the Pi-puck extension in order to support additional I2C buses (the RPi alone has only one usable I2C bus). These extra buses are needed to avoid conflicts between the Time-of-Flight sensors, which have a fixed I2C address.

==Prerequisites==
The following steps need to be done only once:
# The communication between Aseba and Elisa-3 goes through the USB cable (serial communication), so you need to install the driver; refer to section [https://www.gctronic.com/doc/index.php/Elisa-3#Requirements Elisa-3 requirements]. RF support may be added in the future.
# [https://projects.gctronic.com/elisa3/aseba-bin-1.5.5-git-b858c2e-win32.exe Download] and install Aseba version '''1.5.5'''
# Download the Elisa-3 target for Aseba [https://projects.gctronic.com/elisa3/elisa3-aseba.hex elisa3-aseba.hex] and upload it to the robot (refer to section [https://www.gctronic.com/doc/index.php/Elisa-3#Programming Elisa-3 Programming])


=Getting started=
This introductory section explains the minimal procedures needed to work with the Raspberry Pi Zero W / Zero 2 W mounted on the Pi-puck extension board and gives a general overview of the basic demos and scripts shipped with the system flashed on the micro SD card. More advanced demos (e.g. ROS) are described in separate sections below, but the steps documented here are fundamental, so be sure to fully understand them.<br/>
The extension is mostly an interface between the e-puck robot and the Raspberry Pi, so you can exploit the computational power of a Linux machine to extend the robot's capabilities.<br/>
In most cases the Pi-puck extension will be attached to the robot, but note that it can also be used alone, when interaction with the robot isn't required.<br/>
The following sections assume the full configuration (robot + extension), unless otherwise stated.

==Connection with AsebaStudio==
The following steps explain how to start playing with Aseba Studio:<br/>
1. Connect the robot to the computer if not already done and turn it on<br/>
2. Download the following script based on your platform and modify its content, specifying the AsebaStudio installation folder and the robot port:<br/>
* Windows: [https://projects.gctronic.com/elisa3/asebaswitch_elisa3.bat asebaswitch_elisa3.bat]; specify the installation folder (e.g. <code>C:\Program Files (x86)\AsebaStudio</code>) and the port number (e.g. <code>10</code> for <code>COM10</code>)
* Linux / Mac OS: [https://projects.gctronic.com/elisa3/asebaswitch_elisa3.sh asebaswitch_elisa3.sh]; specify the installation folder (e.g. <code>/usr/bin</code> on Linux or <code>/Applications/Aseba/bin</code> on Mac OS) and the port (e.g. <code>/dev/ttyUSB0</code> on Linux or <code>/dev/cu.usbserial-XXXXX</code> on Mac OS)
3. Start the script:
* Windows: double click on the bat file
* Linux / Mac OS: make the script executable with the command <code>chmod +x asebaswitch_elisa3.sh</code> and then run it with <code>./asebaswitch_elisa3.sh</code>
4. Start ''AsebaStudio'', select <code>Network(TCP)</code>, insert <code>localhost</code> as <code>Host</code> and <code>33333</code> as <code>Port</code> to open the connection with the robot<br/>
5. If the connection is correctly established you should see the Elisa-3 variables on the left side of ''AsebaStudio'', as shown in the following figure:<br/>
<span class="plainlinks">[https://www.gctronic.com/doc/images/aseba-screenshot2.jpg <img width=400 src="https://www.gctronic.com/doc/images/aseba-screenshot2-small.jpg">]</span>

Have a look also at the following video (use the script instead of manually issuing the command as done in the video):<br/>
{{#ev:youtube|0jrTgt7F1iM}}

==Simple test==
Once the connection is opened, click on the ''auto'' checkbox to start updating the sensor data automatically. You can now interact with the robot: on the left side you can see all the sensor values (proximity, ground, accelerometer, ...) and you can change the motor speed and turn on/off all the LEDs.


==Requirements==
The robot must be programmed with a special firmware in order to communicate via the I2C bus with the Raspberry Pi mounted on the Pi-puck extension. The same I2C bus is shared by all the devices (camera, IMU, distance sensor, other extensions), the main microcontroller and the Raspberry Pi. Since the Raspberry Pi acts as I2C master, these devices are no longer reachable directly from the robot's main microcontroller, which instead acts as I2C slave.

===e-puck version 1===
The e-puck version 1 robot must be programmed with the following firmware: [https://raw.githubusercontent.com/yorkrobotlab/pi-puck/master/e-puck1/pi-puck-e-puck1.hex pi-puck-e-puck1.hex].

===e-puck version 2===
The e-puck version 2 robot must be programmed with the following firmware [https://projects.gctronic.com/epuck2/gumstix/e-puck2_main-processor_extension_b346841_07.06.19.elf e-puck2_main-processor_extension.elf (07.06.19)] and the selector must be placed in position 10(A).<br/>
The source code is available in the <code>gumstix</code> branch of the repo <code>https://github.com/e-puck2/e-puck2_main-processor</code>.

==Turn on/off the extension==
To turn on the extension press the <code>auxON</code> button, as shown in the following figure; this also turns on the robot (if not already on). Similarly, if you turn on the robot, the extension turns on automatically.<br/>
<span class="plainlinks">[https://projects.gctronic.com/epuck2/wiki_images/pipuck_btn_on_off.jpg <img width=250 src="https://projects.gctronic.com/epuck2/wiki_images/pipuck_btn_on_off-small.jpg">]</span><br/>

To turn off the Pi-puck, press and hold the <code>auxON</code> button for 2 seconds; this initiates the power-down procedure.<br/>

Beware that by turning off the robot the extension is not turned off automatically if it is powered from another source, such as the micro USB cable or a secondary battery; in that case you need to use its power-off button to switch it off. If there is no other power source, turning off the robot also turns off the extension (not cleanly).

==Console mode==
The Pi-puck extension board comes with a pre-configured system ready to run without any additional configuration.<br/>
In order to access the system from a PC in console mode, the following steps must be performed:<br/>
1. connect a micro USB cable from the PC to the extension module. If needed, the drivers are available at the following link: [https://www.silabs.com/products/development-tools/software/usb-to-uart-bridge-vcp-drivers USB to UART bridge drivers]<br/>
<span class="plainlinks">[https://projects.gctronic.com/epuck2/wiki_images/pipuck_usb.png <img width=250 src="https://projects.gctronic.com/epuck2/wiki_images/pipuck_usb-small.png">]</span><br/>
2. run a terminal program and configure the connection as 115200-8N1 (baud rate 115200, 8 data bits, no parity, 1 stop bit) with no flow control. The serial device is the one created when the extension is connected to the computer<br/>
3. switch on the robot (the extension turns on automatically); the terminal should now display the Raspberry Pi boot messages. If the robot isn't present, you can directly power on the extension board with the related button<br/>
4. login with <code>user = pi</code>, <code>password = raspberry</code><br/>

==Battery charge==
You can charge the robot battery, the additional battery connected to the Pi-puck extension, or both, by simply plugging in the micro USB cable.<br/>
The following figure shows the connector for the additional battery.<br/>
<span class="plainlinks">[https://projects.gctronic.com/epuck2/wiki_images/pipuck_battery.jpg <img width=250 src="https://projects.gctronic.com/epuck2/wiki_images/pipuck_battery-small.jpg">]</span><br/>

The robot can also autonomously charge itself if the charging wall is available. The Pi-puck extension includes two spring contacts on the front side that let the robot easily make contact with the charging wall and charge itself. The charging wall and the spring contacts are shown in the following figures:<br/>
<span class="plainlinks">[https://www.gctronic.com/img2/shop/pipuck-charger-robot.jpg <img width=250 src="https://www.gctronic.com/img2/shop/pipuck-charger-robot-small.jpg">]</span>
<span class="plainlinks">[https://projects.gctronic.com/epuck2/wiki_images/pipuck_contacts.jpg <img width=250 src="https://projects.gctronic.com/epuck2/wiki_images/pipuck_contacts-small.jpg">]</span><br/>

==Reset button==
A button is available to reset the robot; when pressed it resets only the robot, restarting its firmware. This is useful, for instance, during development or for specific demos in which a restart of the robot is needed: you don't need to turn off the robot completely (and consequently also the Pi-puck, if energy is supplied by the robot), you can simply reset it. The position of the reset button is shown in the following figure:<br/>
<span class="plainlinks">[https://projects.gctronic.com/epuck2/wiki_images/pipuck_reset.png <img width=250 src="https://projects.gctronic.com/epuck2/wiki_images/pipuck_reset-small.png">]</span><br/>

=Software=
First of all, have a look at some of the examples proposed in the section [https://www.gctronic.com/doc/index.php/Elisa-3_Aseba#AsebaStudio_examples AsebaStudio examples].<br/>
Then, when you're ready, you can start programming the robot on your own; refer to section [https://www.gctronic.com/doc/index.php/Elisa-3_Aseba#Programming_interface Programming interface]. Moreover you can have a look at [https://www.thymio.org/en:start https://www.thymio.org/en:start] for more information.<br/>
<font style="color:red">Pay attention that only 100 bytes are available for your script due to memory constraints.</font><br/>
If you want to have a look behind the scenes, refer to section [https://www.gctronic.com/doc/index.php/Elisa-3_Aseba#Contribute_to_the_Elisa-3_Aseba_target Contribute to the Elisa-3 Aseba target].

==AsebaStudio examples==
You can download all the following examples from [https://projects.gctronic.com/elisa3/aseba-elisa3-examples.zip aseba-elisa3-examples.zip]; in order to launch an example follow these steps:
# place the robot selector in position 5. When the robot is turned on, the position of the selector (from 0 to 9) defines the node name in ''AsebaStudio''; in our case the node name will be <code>elisa3-5</code> (where ''5'' is the selector position)
# extract the zip; the directory contains files with the ''aesl'' extension, which is the ''AsebaStudio'' code format
# connect the robot to ''AsebaStudio'' as explained previously
# click on menu <code>File => Open...</code> and select one of the examples extracted from the zip
# click on the <code>Load</code> button and then on <code>Run</code>; the code is now running on the robot but it isn't stored in EEPROM, thus when you turn off the robot it returns to its initial state
If you want to save the program in memory you need to click on <code>Tools => Write the program(s)... => inside the elisa3</code> and wait for the programming to finish (the green LEDs around the robot are turned on while the memory is written); <font style="color:red">pay attention to uncheck the ''auto'' update of the robot variables in ''AsebaStudio'' before starting the write (with ''auto'' update enabled the write could block)</font>.<br/>

===Simple obstacle avoidance===
<pre>
var i = 1
while (i==1) do
if ((prox[0] > 50) or (prox[1] > 50) or (prox[7] > 50)) then
mot.left.target = -20
mot.right.target = 20
else
mot.left.target = 20
mot.right.target = 20
end
end
</pre>
To fully see the results of this example you need to write the code into the robot, then let it move with some objects around.

===RGB control===
<pre>
onevent ir.sensors
if (prox[0] > 20) then # avoid noise
led.rgb[0] = prox[0]
else
led.rgb[0] = 0
end
if (prox[1] > 20) then # avoid noise
led.rgb[1] = prox[1]
else
led.rgb[1] = 0
end
if (prox[7] > 20) then # avoid noise
led.rgb[2] = prox[7]
else
led.rgb[2] = 0
end
</pre>
Once the code is loaded on the robot you can "control" the intensity of the red, green and blue channels with ''prox[0]'', ''prox[1]'' and ''prox[7]'' respectively. Try to get the primary and secondary colors ([https://en.wikipedia.org/wiki/Primary_color https://en.wikipedia.org/wiki/Primary_color])... ''hint: you need two fingers''.

===Working with events===
<pre>
var color = 0

onevent ir.sensors
led.green[0] = 1 - led.green[0]
led.green[2] = 1 - led.green[2]
led.green[4] = 1 - led.green[4]
led.green[6] = 1 - led.green[6]

onevent acc
led.green[1] = 1 - led.green[1]
led.green[3] = 1 - led.green[3]
led.green[5] = 1 - led.green[5]
led.green[7] = 1 - led.green[7]

onevent timer
led.rgb[color] = 255 - led.rgb[color]

onevent button
if (color == 2) then
color = 0
else
color++
end
led.rgb[0] = 0
led.rgb[1] = 0
led.rgb[2] = 0
</pre>
The green LEDs show the update frequency of the proximity and accelerometer sensors (you can measure it with an oscilloscope if you have one). You can change the value of the <code>timer.period</code> variable to change the blinking frequency of the RGB LED; the resolution is 1 ms (e.g. with a value of 1000 the RGB LED blinks at 1 Hz). Moreover, you can try pressing the button and see what happens (you can probably deduce it from the code...).

===Remote control===
<pre>
onevent rc5
if rc5 == 2 then # forward
mot.left.target = 20
mot.right.target = 20
elseif rc5 == 5 then # stop
mot.left.target = 0
mot.right.target = 0
elseif rc5 == 8 then # backward
mot.left.target = -20
mot.right.target = -20
elseif rc5 == 4 then # left
mot.left.target = 0
mot.right.target = 20
elseif rc5 == 6 then # right
mot.left.target = 20
mot.right.target = 0
else # error
led.rgb[0] = 255
end
</pre>
To fully see the results of this example you need to write the code into the robot. You may want to adapt the values used for the various motions, and you can certainly extend the functionality further (e.g. change the RGB LED color).


===Simple local communication===
In this example we need to connect two robots to ''AsebaStudio'' at the same time; to accomplish this, ''asebaswitch'' needs to be called with a different command:<br/>
<code>asebaswitch -d -v "ser:port=104;baud=57600;stop=1;parity=none;fc=none;bits=8" "ser:port=69;baud=57600;stop=1;parity=none;fc=none;bits=8"</code><br/>
Basically there are two targets instead of one; you need to specify the correct <code>port</code> number for both robots. Moreover, you need to place the selector of the receiver robot in position 5 and the selector of the transmitter robot in another position (from 0 to 9). Both robots blink if the connection is correctly opened.<br/>
Load the following code to the receiver robot:
<pre>
call prox.comm.enable(1)

onevent prox.comm
led.green[0] = 0
led.green[1] = 0
led.green[2] = 0
led.green[3] = 0
led.green[4] = 0
led.green[5] = 0
led.green[6] = 0
led.green[7] = 0
led.green[prox.comm.rx.id] = 1
if (prox.comm.rx == 1) then
led.rgb[0] = 255
led.rgb[1] = 0
led.rgb[2] = 0
elseif (prox.comm.rx == 2) then
led.rgb[0] = 0
led.rgb[1] = 255
led.rgb[2] = 0
elseif (prox.comm.rx == 3) then
led.rgb[0] = 0
led.rgb[1] = 0
led.rgb[2] = 255
end
</pre>
Load the following line of code to the transmitter robot:
<pre>
call prox.comm.enable(1)
</pre>
Now you can change the <code>prox.comm.tx</code> values (from 1 to 3) in the transmitter tab and see the effect on the receiver robot; <code>prox.comm.rx</code> and <code>prox.comm.rx.id</code> in the receiver tab will change accordingly. You can easily transform the id into an angle by knowing that the sensors are placed 45 degrees apart. Remember to place the robots near each other (< 5 cm).

=How to communicate with the robot and its sensors=
==Communicate with the e-puck version 1==
Refer to the repo [https://github.com/yorkrobotlab/pi-puck-e-puck1 https://github.com/yorkrobotlab/pi-puck-e-puck1].

==Communicate with the e-puck version 2==
An example showing how to exchange data between the robot and the Pi-puck extension is available in the Pi-puck repository; you can find it in the directory <code>/home/pi/Pi-puck/e-puck2/</code>.<br/>
You can build the program with the command <code>gcc e-puck2_test.c -o e-puck2_test</code>.<br/>
Now you can run the program by issuing <code>./e-puck2_test</code>; this demo prints the sensor data on the terminal and sends some commands to the robot at 2 Hz.<br/>
The same example is also available in Python; you can run it by issuing <code>python3 e-puck2_test.py</code>.

===Packet format===
Extension to robot packet format, 20 bytes payload (the number in parentheses represents the size in bytes of each field):
{| border="1"
| Left speed (2)
| Right speed (2)
| Speaker (1)
| LED1, LED3, LED5, LED7 (1)
| LED2 RGB (3)
| LED4 RGB (3)
| LED6 RGB (3)
| LED8 RGB (3)
| Settings (1)
| Checksum (1)
|}
* Left, right speed: [-2000 ... 2000]
* Speaker: sound id = [0, 1, 2]
* LEDs on/off flag: bit0 for LED1, bit1 for LED3, bit2 for LED5, bit3 for LED7
* RGB LEDs: [0 (off) ... 100 (max)]
* Settings:
** bit0: 1=calibrate IR proximity sensors
** bit1: 0=disable onboard obstacle avoidance; 1=enable onboard obstacle avoidance (not implemented yet)
** bit2: 0=set motors speed; 1=set motors steps (position)
* Checksum: Longitudinal Redundancy Check (XOR of bytes 0..18)

Robot to extension packet format, 47 bytes payload (the number in parentheses represents the size in bytes of each field):
{| border="1"
| 8 x Prox (16)
| 8 x Ambient (16)
| 4 x Mic (8)
| Selector + button (1)
| Left steps (2)
| Right steps (2)
| TV remote (1)
| Checksum (1)
|}
* Selector + button: the selector value is in the 4 least significant bits (bit0..bit3); the button state is in bit4 (1=pressed, 0=not pressed)
* Checksum: Longitudinal Redundancy Check (XOR of bytes 0..45)
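As a quick illustration of the checksum field, the Longitudinal Redundancy Check is just the XOR of the payload bytes; a minimal Python sketch (the field encodings, e.g. the byte order of the 16-bit values, are not specified here):
<pre>
def lrc(payload):
    # Longitudinal Redundancy Check: XOR of all payload bytes.
    checksum = 0
    for b in payload:
        checksum ^= b
    return checksum

# Example: an all-zero placeholder extension-to-robot payload (bytes 0..18),
# with the checksum appended as byte 19 to form the 20-byte packet.
payload = bytes(19)
packet = payload + bytes([lrc(payload)])
assert len(packet) == 20
</pre>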
 
==Communicate with the IMU==
===e-puck version 1===
An example written in C showing how to read data from the IMU (LSM330) mounted on e-puck version 1.3 is available in the Pi-puck repository; you can find it in the directory <code>/home/pi/Pi-puck/e-puck1/</code>.<br/>
You can build the program with the command <code>gcc e-puck1_imu.c -o e-puck1_imu</code>.<br/>
Now you can run the program by issuing <code>./e-puck1_imu</code> and then choosing whether to get data from the accelerometer or the gyroscope; this demo prints the sensor data on the terminal.<br/>
 
===e-puck version 2===
An example showing how to read data from the IMU (MPU-9250) is available in the Pi-puck repository; you can find it in the directory <code>/home/pi/Pi-puck/e-puck2/</code>.<br/>
You can build the program with the command <code>gcc e-puck2_imu.c -o e-puck2_imu</code>.<br/>
Now you can run the program by issuing <code>./e-puck2_imu</code> and then choosing whether to get data from the accelerometer or the gyroscope; this demo prints the sensor data on the terminal.<br/>
The same example is also available in Python; you can run it by issuing <code>python3 e-puck2_imu.py</code>.
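If you want to see what such a reading looks like at the I2C level, the following minimal Python sketch reads the raw accelerometer registers with <code>smbus2</code>; 0x68 and 0x3B are the standard MPU-9250 address and first accelerometer data register, while the bus number is an assumption (check which bus the robot is exposed on, e.g. with <code>i2cdetect -l</code>):
<pre>
from smbus2 import SMBus

MPU9250_ADDR = 0x68   # standard MPU-9250 I2C address
ACCEL_XOUT_H = 0x3B   # first of the 6 accelerometer data registers

def to_int16(high, low):
    # combine two bytes (big-endian) into a signed 16-bit value
    value = (high << 8) | low
    return value - 65536 if value & 0x8000 else value

with SMBus(4) as bus:  # bus number is an assumption, check with i2cdetect -l
    raw = bus.read_i2c_block_data(MPU9250_ADDR, ACCEL_XOUT_H, 6)
    ax, ay, az = (to_int16(raw[i], raw[i + 1]) for i in (0, 2, 4))
    print("accel raw:", ax, ay, az)
</pre>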
 
==Communicate with the ToF sensor==
The Time of Flight sensor is available only on the e-puck version 2 robot.<br/>
 
First of all, verify that the VL53L0X Python package is installed with the following command: <code>python3 -c "import VL53L0X"</code>. If the command returns nothing you're ready to go; if instead you receive an <code>ImportError</code>, install the package with the command <code>pip3 install git+https://github.com/gctronic/VL53L0X_rasp_python</code>.<br/>
 
A Python example showing how to read data from the ToF sensor is available in the Pi-puck repository; you can find it in the directory <code>/home/pi/Pi-puck/e-puck2/</code>.<br/>
You can run the example by issuing <code>python3 VL53L0X_example.py</code> (this is the example that you can find in the repository [https://github.com/gctronic/VL53L0X_rasp_python/tree/master/python https://github.com/gctronic/VL53L0X_rasp_python/tree/master/python]).
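For reference, the basic usage pattern of the VL53L0X package looks roughly like the following sketch (method and constant names as used in the example of the repository linked above; double-check them there):
<pre>
import time
import VL53L0X

tof = VL53L0X.VL53L0X()  # sensor on the default I2C bus/address
tof.start_ranging(VL53L0X.VL53L0X_BETTER_ACCURACY_MODE)
for _ in range(10):
    print("distance: %d mm" % tof.get_distance())
    time.sleep(0.1)
tof.stop_ranging()
</pre>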
 
==Capture an image==
The robot camera is connected to the Pi-puck extension as a USB camera, so you can access it very easily.<br/>
An example showing how to capture an image from the robot's camera using OpenCV is available in the Pi-puck repository; you can find it in the directory <code>/home/pi/Pi-puck/snapshot/</code>.<br/>
You can build the program with the command <code>g++ $(pkg-config --libs --cflags opencv) -ljpeg -o snapshot snapshot.cpp</code>.<br/>
Now you can run the program by issuing <code>./snapshot</code>; this saves a VGA image (JPEG) named <code>image01.jpg</code> to disk.<br/>
The program accepts the following parameters:<br/>
<code>-d DEVICE_ID</code> to specify the input video device from which to capture an image; the default is <code>0</code> (<code>/dev/video0</code>). This is useful when also working with the [http://www.gctronic.com/doc/index.php?title=Omnivision_Module_V3 Omnivision V3] extension, which creates another video device; in that case you need to specify <code>-d 1</code> to capture from the robot camera.<br/>
<code>-n NUM</code> to specify how many images to capture (1-99); the default is 1<br/>
<code>-v</code> to enable verbose mode (print some debug information)<br/>
Beware that in this demo the acquisition rate is fixed at 5 Hz, but the camera supports up to '''15 FPS'''.<br/>
The same example is also available in Python; you can run it by issuing <code>python snapshot.py</code>.
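The Python variant essentially boils down to a few OpenCV calls; a minimal sketch (device id 0 as discussed above):
<pre>
import cv2

cap = cv2.VideoCapture(0)  # /dev/video0; use 1 when another video device is present
if not cap.isOpened():
    raise RuntimeError("cannot open the camera")
ret, frame = cap.read()    # grab one frame
cap.release()
if ret:
    cv2.imwrite("image01.jpg", frame)  # save it as JPEG
</pre>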
 
==Communicate with the ground sensors extension==
Both e-puck version 1 and e-puck version 2 support the [https://www.gctronic.com/doc/index.php?title=Others_Extensions#Ground_sensors ground sensors extension].<br/>
This extension is attached to the I2C bus and can be read directly from the Pi-puck.<br/>
An example written in C showing how to read data from the ground sensors extension is available in the Pi-puck repository; you can find it in the directory <code>/home/pi/Pi-puck/ground-sensor/</code>.<br/>
You can build the program with the command <code>gcc groundsensor.c -o groundsensor</code>.<br/>
Now you can run the program by issuing <code>./groundsensor</code>; this demo prints the sensor data on the terminal.<br/>
The same example is also available in Python; you can run it by issuing <code>python3 groundsensor.py</code>.
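The read itself is a plain I2C block transfer; a minimal Python sketch with <code>smbus2</code>, where the sensor address, the bus number and the data layout are placeholders to be checked against <code>groundsensor.c</code>:
<pre>
from smbus2 import SMBus, i2c_msg

GROUND_ADDR = 0x60  # placeholder: take the real address from groundsensor.c

with SMBus(4) as bus:                    # bus number is an assumption
    read = i2c_msg.read(GROUND_ADDR, 8)  # 4 sensors x 2 bytes (assumed layout)
    bus.i2c_rdwr(read)
    data = list(read)
    values = [(data[2 * i] << 8) | data[2 * i + 1] for i in range(4)]
    print("ground:", values)
</pre>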
 
==Communicate with the range and bearing extension==
Both e-puck version 1 and e-puck version 2 support the [https://www.gctronic.com/doc/index.php?title=Others_Extensions#Range_and_bearing range and bearing extension].<br/>
This extension is attached to the I2C bus and can be read directly from the Pi-puck.<br/>
An example written in C showing how to start playing with the range and bearing extension is available in the Pi-puck repository; you can find it in the directory <code>/home/pi/Pi-puck/randb/</code>. You need two boards: one is the transmitter (run <code>randb_tx</code>) and the other is the receiver (run <code>randb_rx</code>). The receiver will print the data received from the transmitter.<br/>
You can build the programs with the command <code>gcc randb_tx.c -o randb_tx</code> and <code>gcc randb_rx.c -o randb_rx</code>.<br/>
The same example is also available in Python; you can run it by issuing <code>python3 randb_tx.py</code> and <code>python3 randb_rx.py</code>.<br/>

For best performance you also need to take into consideration the interference from the time-of-flight and proximity sensors (see [https://www.gctronic.com/doc/index.php?title=Others_Extensions#e-puck_2 https://www.gctronic.com/doc/index.php?title=Others_Extensions#e-puck_2]).
 
==Wireless remote control==
If you want to control the robot from a computer, for instance when you have an algorithm that requires heavy processing not suitable for the Pi-puck, or when the computer acts as a master controlling a fleet of robots that return some information to the controller, you have 3 options:<br/>
1) The computer establishes a WiFi connection with the Pi-puck to receive data processed by the Pi-puck (e.g. results of an image processing task); at the same time the computer establishes a Bluetooth connection directly with the e-puck version 2 robot to control it.
:''Disadvantages'':
:- the Bluetooth standard allows only up to seven simultaneous connections
:- doubled latency (Pi-puck <-> PC and PC <-> robot)
2) The computer establishes a WiFi connection with both the Pi-puck and the e-puck version 2 robot.
:''Advantages'':
:- only one connection type needed, easier to handle
:''Disadvantages'':
:- doubled latency (Pi-puck <-> PC and PC <-> robot)
3) The computer establishes a WiFi connection with the Pi-puck only, and the Pi-puck is in charge of controlling the robot via I2C based on the data received from the computer controller.
:''Advantages'':
:- lower latency
:- fewer connections to handle
:- depending on your algorithm, you can initially develop the controller on the computer (easier to develop and debug) and then move it directly to the Pi-puck without changing anything related to the control of the robot via I2C
 
The following figure summarizes these 3 options:<br/>
<span class="plainlinks">[https://projects.gctronic.com/epuck2/wiki_images/wireless-remote-control-options.png <img width=600 src="https://projects.gctronic.com/epuck2/wiki_images/wireless-remote-control-options.png">]</span>
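As a sketch of option 3, the Pi-puck can run a small TCP server that receives speed commands from the computer and forwards them to the robot; the wire format (two little-endian int16 values) and the port number are arbitrary choices here, and the I2C forwarding is left as a placeholder:
<pre>
import socket
import struct

def set_speed(left, right):
    # placeholder: forward the command to the robot over I2C here
    print("set speed:", left, right)

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("", 4242))    # arbitrary port
server.listen(1)
conn, addr = server.accept()
while True:
    data = conn.recv(4)    # one command = two int16 values (left, right)
    if len(data) < 4:
        break
    left, right = struct.unpack("<hh", data)
    set_speed(left, right)
conn.close()
</pre>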
 
=How to work with the Pi-puck=
==Demos and scripts update==
First of all, you should update to the latest version of the demos and scripts released with the system, which you can use to start playing with the Pi-puck extension and the robot.<br/>
To update the repository follow these steps:<br/>
1. go to the directory <code>/home/pi/Pi-puck</code><br/>
2. issue the command <code>git pull</code><br/>
Then to update some configurations of the system:<br/>
1. go to the directory <code>/home/pi/Pi-puck/system</code><br/>
2. issue the command <code>./update.sh</code>; the system will reboot.<br/>
You can find the Pi-puck repository here [https://github.com/gctronic/Pi-puck https://github.com/gctronic/Pi-puck].<br/>
 
==Audio recording==
Use the <code>arecord</code> utility to record audio from the onboard microphone. The following example shows how to record an audio of 2 seconds (<code>-d</code> parameter) and save it to a wav file (<code>test.wav</code>):<br/>
<code>arecord -Dmic_mono -c1 -r16000 -fS32_LE -twav -d2 test.wav</code><br/>
You can also specify a rate of 48 kHz with <code>-r48000</code>.
 
==Audio play==
Use <code>aplay</code> to play <code>wav</code> files and <code>mplayer</code> to play <code>mp3</code> files.
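Both utilities are also easy to drive from a script, e.g. from Python (same <code>arecord</code> parameters as above):
<pre>
import subprocess

# record 2 seconds from the onboard microphone, then play the file back
subprocess.run(["arecord", "-Dmic_mono", "-c1", "-r16000",
                "-fS32_LE", "-twav", "-d2", "test.wav"], check=True)
subprocess.run(["aplay", "test.wav"], check=True)
</pre>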
 
==Battery reading==
A Python example showing how to measure both the battery of the robot and the battery of the Pi-puck extension is available in the Pi-puck repository; you can find it in the directory <code>/home/pi/Pi-puck/battery/</code>.<br/>
You can start reading the battery values by issuing <code>python read-battery.py</code>; this demo prints the battery values (given in Volts) on the terminal.<br/>
An additional Python example in the same directory shows how to detect when the robot is charging: it is a basic demo in which the robot goes forward and stops only when it is charging; it can be used as a starting point for more advanced examples. To run this demo issue the commands <code>sudo pigpiod</code> and then <code>python3 goto-charge.py</code>.


==WiFi configuration==
Specify your network configuration in the file <code>/etc/wpa_supplicant/wpa_supplicant-wlan0.conf</code>.<br/>
Example:<br/>
<pre>
ctrl_interface=DIR=/var/run/wpa_supplicant GROUP=netdev
update_config=1
country=CH
network={
        ssid="MySSID"
        psk="9h74as3xWfjd"
}
</pre>
You can have more than one <code>network</code> block to support multiple networks. For more information about ''wpa_supplicant'' refer to [https://hostap.epitest.fi/wpa_supplicant/ https://hostap.epitest.fi/wpa_supplicant/].<br/>
Once the configuration is done, you can also connect to the Pi-puck with <code>SSH</code>. If you are working in Windows you can use [https://www.putty.org/ PuTTY].

===Square path===
In this example we exploit the onboard odometry to let the robot move along a square path. There are two possibilities: running on a vertical wall or running on a horizontal plane; both cases are handled automatically. When the Elisa-3 is turned on it calibrates its sensors (when placed vertically it rotates around itself for a while with the green LED turned on); when the calibration process is finished you can start the square by "touching" the back proximity sensor. Pay attention that, when placed vertically, the robot must be positioned with its front pointing to the right.
<div class="toccolours mw-collapsible mw-collapsed">
<pre>
var DISTANCE = 100 # given in mm
var start = 0
var state = 0
var isVertical
</pre>
<div class="mw-collapsible-content">
<pre>
sub updateState
if start == 1 then
if (state == 0) then
mot.left.target = 20
mot.right.target = 20
led.rgb[0] = 255
led.rgb[1] = 0
led.rgb[2] = 0
if (odom.x >= DISTANCE) then
state = 1
end
elseif (state == 1) then
mot.left.target = 0
mot.right.target = 15
led.rgb[0] = 0
led.rgb[1] = 255
led.rgb[2] = 0
if (odom.theta >= 90) then
state = 2
end
elseif (state == 2) then
mot.left.target = 20
mot.right.target = 20
led.rgb[0] = 0
led.rgb[1] = 0
led.rgb[2] = 255
if (odom.y >= DISTANCE) then
state = 3
end
elseif (state == 3) then
mot.left.target = 0
mot.right.target = 20
led.rgb[0] = 255
led.rgb[1] = 255
led.rgb[2] = 0
call robot.isVertical(isVertical)
if (isVertical == 1) then
if (odom.theta < 0) then
state = 4
end
else
if (odom.theta >= 180) then
state = 4
end
end
elseif (state == 4) then
mot.left.target = 20
mot.right.target = 20
led.rgb[0] = 255
led.rgb[1] = 0
led.rgb[2] = 255
if (odom.x <= 0) then
state = 5
end
elseif (state == 5) then
mot.left.target = 0
mot.right.target = 20
led.rgb[0] = 0
led.rgb[1] = 255
led.rgb[2] = 255
call robot.isVertical(isVertical)
if (isVertical == 1) then
if ((odom.theta >= -90) and (odom.theta < 0) ) then
state = 6
end
else
if (odom.theta >= 270) then
state = 6
end
end
elseif (state == 6) then
mot.left.target = 20
mot.right.target = 20
led.rgb[0] = 0
led.rgb[1] = 0
led.rgb[2] = 0
if (odom.y <= 0) then
state = 7
end
elseif (state == 7) then
mot.left.target = 0
mot.right.target = 20
led.rgb[0] = 0
led.rgb[1] = 0
led.rgb[2] = 0
call robot.isVertical(isVertical)
if (isVertical == 1) then
if (odom.theta >= 0) then
state = 8
end
else
if (odom.theta >= 360) then
state = 8
end
end
elseif (state == 8) then
mot.left.target = 0
mot.right.target = 0
start = 0
end
end


onevent ir.sensors
if (start == 0) then
if (prox[4] > 200) then
call reset.odometry()
state = 0
start = 1
end
end
callsub updateState
</pre>
</div>
</div>


===How to know your IP address===
A simple method to know your IP address is to connect the USB cable to the Pi-puck extension and issue the command <code>ip a</code>; from the command's output you can get your currently assigned IP address.

If you prefer to find the IP address remotely (without connecting any cable) you can use <code>nmap</code>.<br/>
For example, you can search for all connected devices in your network with the command <code>nmap 192.168.1.*</code>; beware that you need to specify the subnet based on your network configuration.<br/>
In the command's output look for the hostname <code>raspberrypi</code>.<br/>
If you are working in Windows you can use the [https://nmap.org/zenmap/ Zenmap] application.

==File transfer==
===USB cable===
You can transfer files between the computer and the Pi-puck extension via the USB cable by using the <code>zmodem</code> protocol.<br/>
The <code>lrzsz</code> package is pre-installed in the system, thus you can use the <code>sx</code> and <code>rx</code> utilities to respectively send files to the computer and receive files from the computer.<br/>
Example of sending a file to the computer using the <code>Minicom</code> terminal program:<br/>
1. in the Pi-puck console type <code>sx --zmodem filename.ext</code>. The transfer should start automatically and you'll find the file in the home directory.<br/>
Example of receiving a file from the computer using the <code>Minicom</code> terminal program:<br/>
1. in the Pi-puck console type <code>rx -Z</code><br/>
2. to start the transfer type the sequence <code>CTRL+A+S</code>, then choose <code>zmodem</code> and select the file you want to send with the <code>spacebar</code>. Finally press <code>enter</code> to start the transfer.<br/>
===WiFi===
The Pi-puck extension supports <code>SSH</code> connections.<br/>
To exchange files between the Pi-puck and the computer, the <code>scp</code> tool (secure copy) can be used. An example of transferring a file from the Pi-puck to the computer is the following:<br/>
<code>scp pi@192.168.1.20:/home/pi/example.txt example.txt</code>
 
If you are working in Windows you can use [https://www.putty.org/ PuTTY].
 
==Image streaming==
 
 
==Bluetooth LE==
An example of a ''BLE uart service'' is available in the Pi-puck repository; you can find it in the directory <code>/home/pi/Pi-puck/ble/</code>.<br/>
To start the service you need to type: <code>python uart_peripheral.py</code>.<br/>
Then you can use the ''e-puck2-android-ble'' app described in chapter [https://www.gctronic.com/doc/index.php?title=e-puck2_mobile_phone_development#Connecting_to_the_BLE Connecting to the BLE] in order to connect to the Pi-puck extension via BLE. Once connected you'll receive some dummy data for the proximity values, and by clicking on the motion buttons you'll see the related action printed on the Pi-puck side. This is a starting point that you can extend based on your needs.
 
=Operating system=
The system is based on Raspbian Stretch and can be downloaded from the following link [https://projects.gctronic.com/epuck2/PiPuck/pi-puck-os_25.05.22.zip pi-puck-os_25.05.22.zip].
 
When booting for the first time, the first thing to do is to expand the file system in order to use all the available space on the micro SD card:<br/>
1. <code>sudo raspi-config</code><br/>
2. Select <code>Advanced Options</code> and then <code>Expand Filesystem</code><br/>
3. reboot
 
==e-puck version 2 camera configuration==
The e-puck version 2 camera needs to be configured through I2C before it can be used. For this reason a Python script that detects and configures the camera is called at boot. The script resides in the Pi-puck repository installed in the system (<code>/home/pi/Pi-puck/camera-configuration.py</code>), so beware not to remove it.
 
If the robot is plugged in after the boot process is completed, you need to manually call the Python configuration script before using the camera, by issuing the command <code>python3 /home/pi/Pi-puck/camera-configuration.py</code>.
 
In order to automatically run the script at boot, the <code>/etc/rc.local</code> file was modified by adding the call to the script just before the end of the file.
 
==Power button handling==
The power button press is handled by a background service (<code>systemd</code>) started automatically at boot. The service description file is located at <code>/etc/systemd/system/power_handling.service</code> and it calls the <code>/home/pi/power-handling/</code> program. Beware not to remove either of these files.<br/>
The source code of the power button handling program is available in the Pi-puck repository and is located in <code>/home/pi/Pi-puck/power-handling/power-handling.c</code>.
 
==Desktop mode==
The system starts in console mode, to switch to desktop (LXDE) mode issue the command <code>startx</code>.
===Camera viewer===
A camera viewer called <code>luvcview</code> is installed in the system. You can open a terminal and simply issue the command <code>luvcview</code> to see the image coming from the robot camera.
 
==VNC==
[https://www.realvnc.com/en/ VNC] is a remote control desktop application that lets you connect to the Pi-puck from your computer and then you will see the desktop of the Pi-puck inside a window on your computer. You'll be able to control it as though you were working on the Pi-puck itself.<br/>
VNC is installed in the system and the ''VNC server'' is automatically started at boot, thus you can connect with ''VNC Viewer'' from your computer by knowing the IP address of the Pi-puck (refer to section [https://www.gctronic.com/doc/index.php?title=Pi-puck#How_to_know_your_IP_address How to know your IP address]).<br/>
Notice that the ''VNC server'' is started also in console mode.
 
==I2C communication==
The communication between the Pi-puck extension and the robot is based on I2C. The system is configured to exploit the I2C hardware peripheral in order to save CPU usage, but if you need to use the software I2C you can enable it by modifying the <code>/boot/config.txt</code> file and removing the <code>#</code> symbol (comment) in front of the line with the text <code>dtparam=soft_i2c</code> (it is placed towards the end of the file).
 
==Audio output configuration==
You can enable or disable audio output by modifying the <code>config.txt</code> file in the <code>boot</code> partition.<br/>
To enable audio output insert the line: <code>gpio=22=op,dh</code><br/>
To disable audio output insert the line: <code>gpio=22=op,dl</code><br/>
If you don't need to play audio files it is suggested to disable audio output in order to save power.
 
=ROS=
ROS Kinetic is integrated in the Pi-puck system.<br/>
A ROS node developed to run on the Pi-puck is available in both <code>CPP</code> and <code>Python</code> versions; the communication system is based on the third architecture shown in chapter [https://www.gctronic.com/doc/index.php?title=Pi-puck#Wireless_remote_control Wireless remote control]. A more detailed schema is shown below:<br/>
<span class="plainlinks">[https://projects.gctronic.com/epuck2/wiki_images/epuck2-ros-schema.png <img width=600 src="https://projects.gctronic.com/epuck2/wiki_images/epuck2-ros-schema.png">]</span>
 
==Initial configuration==
The ROS workspace is located in <code>~/rosbots_catkin_ws/</code><br/>
The e-puck version 2 ROS driver is located in <code>~/rosbots_catkin_ws/src/epuck_driver_cpp/</code><br/>
Remember to follow the steps in section [http://www.gctronic.com/doc/index.php?title=Pi-puck#Requirements Requirements] and section [https://www.gctronic.com/doc/index.php?title=Pi-puck#Demos_and_scripts_update Demos and scripts update] (they need to be done only once).<br/>
The PC (if used) and the Pi-puck extension are supposed to be configured in the same network.
 
==Running roscore==
<code>roscore</code> can be launched either from the PC or directly from the Pi-puck.<br/>
Before starting roscore, open a terminal and issue the following commands:
* <code>export ROS_IP=roscore-ip</code>
* <code>export ROS_MASTER_URI=http://roscore-ip:11311</code>
where <code>roscore-ip</code> is the IP of the machine that runs <code>roscore</code><br/>
Then start <code>roscore</code> by issuing the command <code>roscore</code>.
 
==Running the ROS node==
Before starting the e-puck version 2 ROS node on the Pi-puck, issue the following commands:
* <code>export ROS_IP=pipuck-ip</code>
* <code>export ROS_MASTER_URI=http://roscore-ip:11311</code>
where <code>pipuck-ip</code> is the IP of the Pi-puck extension and <code>roscore-ip</code> is the IP of the machine that runs <code>roscore</code> (can be the same IP if <code>roscore</code> runs directly on the Pi-puck).
 
To start the e-puck version 2 ROS node issue the command:<br/>
<code>roslaunch epuck_driver_cpp epuck_minimal.launch debug_en:=true ros_rate:=20</code><br/>
 
The following graph shows all the topics published by the e-puck version 2 driver node:<br/>
<span class="plainlinks">[https://projects.gctronic.com/epuck2/wiki_images/ros-e-puck2_.jpg <img width=150 src="https://projects.gctronic.com/epuck2/wiki_images/ros-e-puck2_small.jpg">]</span>
''<font size="2">Click to enlarge</font>''
 
==Test the communication==
You can test whether the communication between the robot and the computer is actually working by simply displaying the messages published on a topic, e.g.:<br/>
<code>rostopic echo /proximity0</code><br/>
You can have the list of all the topics by issuing the command: <code>rostopic list</code>.<br/>
You can move the robot straight forward by issuing <code>rostopic pub -1 /mobile_base/cmd_vel geometry_msgs/Twist -- '[4.0, 0.0, 0.0]' '[0.0, 0.0, 0.0]'</code>.<br/>
You can rotate the robot on the spot by issuing <code>rostopic pub -1 /mobile_base/cmd_vel geometry_msgs/Twist -- '[0.0, 0.0, 0.0]' '[0.0, 0.0, 1.0]'</code>.<br/>
You can change the LEDs state by issuing <code>rostopic pub -1 /mobile_base/cmd_led std_msgs/UInt8MultiArray "{data: [LED1, LED3, LED5, LED7]}"</code>, e.g. <code>rostopic pub -1 /mobile_base/cmd_led std_msgs/UInt8MultiArray "{data: [1, 1, 1, 1]}"</code> to turn them all on.<br/>
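The same interaction can be scripted; a minimal rospy sketch using the topic names listed above (the message type of <code>/proximity0</code> is assumed to be <code>sensor_msgs/Range</code>, check it with <code>rostopic info /proximity0</code>):
<pre>
import rospy
from geometry_msgs.msg import Twist
from sensor_msgs.msg import Range  # assumed message type of /proximity0

def on_prox(msg):
    rospy.loginfo("proximity0: %f", msg.range)

rospy.init_node("pipuck_test")
rospy.Subscriber("/proximity0", Range, on_prox)
pub = rospy.Publisher("/mobile_base/cmd_vel", Twist, queue_size=1)

twist = Twist()
twist.linear.x = 4.0  # same value as in the rostopic example above
rate = rospy.Rate(10)
while not rospy.is_shutdown():
    pub.publish(twist)
    rate.sleep()
</pre>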
 
==Get the source code==
The latest version of the e-puck version 2 ROS node can be downloaded from git: <code>git clone -b pi-puck https://github.com/gctronic/epuck_driver_cpp.git</code><br/>
 
To update to the latest version follow these steps:
# <code>cd ~/rosbots_catkin_ws/src/</code>
# <code>rm -R -f epuck_driver_cpp</code>
# <code>git clone -b pi-puck https://github.com/gctronic/epuck_driver_cpp.git</code>
# <code>cd ~/rosbots_catkin_ws/</code>
# <code>catkin_make --only-pkg-with-deps epuck_driver_cpp</code>


==Python version==
A Python version developed by the York Robotics Lab can be found here: [https://github.com/yorkrobotlab/pi-puck-ros https://github.com/yorkrobotlab/pi-puck-ros].

=ROS 2=
ROS 2 Foxy running on Raspberry Pi OS Buster is available at the following link: [https://drive.google.com/file/d/150bHaXz-NGelogcMq1AlOrvf-U5tPT_g/view?usp=sharing ros2_foxy_epuck2.img].

==Running the ROS 2 node==
To start the robot node issue the command <code>ros2 run epuck_ros2_driver driver</code>.<br/>
To start the camera node issue the command <code>ros2 run epuck_ros2_camera camera</code>.

==Test the communication==
You can test whether the communication between the robot and the computer is actually working by simply displaying the messages published on a topic, e.g.:<br/>
<code>ros2 topic echo /tof</code><br/>
You can get the list of all the topics by issuing the command <code>ros2 topic list</code>.<br/>
You can move the robot straight forward by issuing <code>ros2 topic pub -1 /cmd_vel geometry_msgs/Twist "{linear:{x: 2.0, y: 0.0, z: 0.0}, angular:{x: 0.0, y: 0.0, z: 0.0}}"</code>.<br/>
You can change the LED state by issuing <code>ros2 topic pub -1 /led0 std_msgs/msg/Bool "{data: true}"</code>.

==Get the source code==
The latest version of the e-puck version 2 ROS 2 node can be downloaded from git: <code>git clone https://github.com/cyberbotics/epuck_ros2.git</code><br/>

=OpenCV=
OpenCV 3.4.1 is integrated in the Pi-puck system.

==Contribute to the Elisa-3 Aseba target==
You can get the source code of the Elisa-3 Aseba target from [https://github.com/aseba-community/aseba-targets-arduino.git github].<br/>
The repo contains all the Arduino targets; the Elisa-3 target is placed in the directory ''elisa3''. To build the project follow these steps:
# clone the repo by issuing the command <code>git clone --recursive https://github.com/gctronic/aseba-targets-arduino.git</code>
# download [https://www.atmel.com/Microsite/atmel-studio/ Atmel Studio 7], since this IDE was used to create the project; the installation of ''Atmel Studio'' also includes the toolchain, so you should be able to build the project without any modification
# to open the project, double click <code>elisa3-aseba.atsln</code>

===Dependencies===
The project depends on some files of [https://github.com/aseba-community/aseba Aseba], which is included as a submodule of the [https://github.com/aseba-community/aseba-targets-arduino.git aseba-targets-arduino repo] to simplify the build. The files referenced from the project are:
* <code>aseba\trunk\vm\natives.c</code>
* <code>aseba\trunk\vm\natives.h</code>
* <code>aseba\trunk\vm\vm.c</code>
* <code>aseba\trunk\vm\vm.h</code>
* <code>aseba\trunk\transport\buffer\vm-buffer.c</code>
* <code>aseba\trunk\transport\buffer\vm-buffer.h</code>
The project also depends on the Elisa-3 library contained in the [https://www.gctronic.com/doc/index.php/Elisa-3#Advanced_demo ''Elisa-3 advanced firmware revision 221''].

=Programming interface=
This page describes the programming capabilities of the Elisa-3. It lists the different <font style="color:red">variables</font>, <font style="color:green">events</font> and <font style="color:blue">functions</font> and indicates to which elements of the robot they refer (see section [https://www.gctronic.com/doc/index.php/Elisa-3#Hardware Hardware] to know where the sensors and actuators are actually located on the robot). Each variable is marked with either ''[R]'' or ''[W]'' to indicate whether the variable is used to read a value from the robot or to write a value to the robot, respectively. This page refers to firmware revision 0 and later.<br/>
You can find a document that summarizes the programming interface at the following link: [https://projects.gctronic.com/elisa3/elisa3-aseba-cheatsheet.png elisa3-aseba-cheatsheet.png].

==Standard library==
The Elisa-3 comes with the Aseba [https://www.thymio.org/en:asebastdnative standard library of native functions, documented on its own page].

==Motors==
You can change the wheel speeds by writing to these variables:
* <font style="color:red"><code>motor.left.target</code></font> ''[W]'': requested speed for the left wheel
* <font style="color:red"><code>motor.right.target</code></font> ''[W]'': requested speed for the right wheel
You can read the real wheel speeds from these variables:
* <font style="color:red"><code>motor.left.speed</code></font> ''[R]'': real speed of the left wheel
* <font style="color:red"><code>motor.right.speed</code></font> ''[R]'': real speed of the right wheel
The values range from -127 to 127; one unit = 5 mm/s, so a value of 127 corresponds to a linear speed of approximately 60 cm/s.
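A quick sanity check of the unit conversion, e.g. in Python (pure arithmetic, no robot required):
<pre>
UNIT_MM_S = 5  # one Aseba speed unit = 5 mm/s

def to_units(mm_per_s):
    # convert a wheel speed in mm/s to Aseba units, clamped to [-127, 127]
    return max(-127, min(127, round(mm_per_s / UNIT_MM_S)))

print(to_units(600))    # 60 cm/s -> 120 units
print(127 * UNIT_MM_S)  # maximum: 635 mm/s, i.e. roughly 60 cm/s
</pre>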


=York Robotics Lab Expansion Board=
==Green LEDs==
The York Robotics Lab developed an expansion board for the Pi-puck extension that includes: 9-DoF IMU, 5-input navigation switch, RGB LED, XBee socket, 24-pin Raspberry Pi compatible header. For more information have a look at [https://pi-puck.readthedocs.io/en/latest/extensions/yrl-expansion/ https://pi-puck.readthedocs.io/en/latest/extensions/yrl-expansion/].<br/>
8 green LEDs make up a circle on the bottom of the robot.<br/>
<span class="plainlinks">[https://projects.gctronic.com/epuck2/wiki_images/yrl-expansion-top.jpg <img width=350 src="https://projects.gctronic.com/epuck2/wiki_images/yrl-expansion-top.jpg">]</span><br/>
<font style="color:red"><code>led.green[0..7]</code></font> ''[W]'': index 0 sets the intensity of the LED at the front of the robot, the others are numbered clockwise.


An example showing how to communicate with the YRL expansion board is available in the Pi-puck repository of the York Robotics Lab:
==RGB LED==
# <code> git clone https://github.com/yorkrobotlab/pi-puck.git pi-puck_yrl</code>
There is one RGB LED in the center of the robot, its light is smoothly spread out through the top diffuser.<br/>
# <code>cd pi-puck_yrl/python-library</code>
<font style="color:red"><code>led.rgb[0..2]</code></font> ''[W]'': the indexes 0, 1 and 2 set respectively the intensity of the red, green and blue.<br/>
# <code>python3 pipuck-library-test.py -x</code> Once started, press in sequence up, down, left, right, center to continue the demo.
The values range from 0 (off) to 255 (max intensity).


==Assembly==
==IR transmitters==
The assembly is very simple: place the YRL expansion board on top of the Raspberry Pi and then connect them with the provided screws. Once they are connected, you can attach both on top of the Pi-puck extension.<br/>
There are 3 IR transmitters pointing upwards, two placed in the front side of the robot and one placed in the back side. You can control their state by writing these variables:<br/>
<span class="plainlinks">[https://projects.gctronic.com/epuck2/wiki_images/yrl-exp1.jpg <img width=200 src="https://projects.gctronic.com/epuck2/wiki_images/yrl-exp1.jpg">]</span>
* <font style="color:red"><code>ir.tx.front</code></font> ''[W]'': 0 means that both front IRs are turned off, 1 means that both front IRs are turned on
<span class="plainlinks">[https://projects.gctronic.com/epuck2/wiki_images/yrl-exp2.jpg <img width=150 src="https://projects.gctronic.com/epuck2/wiki_images/yrl-exp2.jpg">]</span>
* <font style="color:red"><code>ir.tx.back</code></font> ''[W]'': 0 means that the back IR is turned off, 1 means that the back IR is turned off
<span class="plainlinks">[https://projects.gctronic.com/epuck2/wiki_images/yrl-exp3.jpg <img width=200 src="https://projects.gctronic.com/epuck2/wiki_images/yrl-exp3.jpg">]</span><br/>
==XBee==
In this section it is explained how to send data from the Pi-puck to the computer using XBee modules Series 1.


The XBee module mounted on the YRL expansion must be programmed with the <code>XBEE 802.15.4-USB ADAPTER</code> firmware; this can be done with the [http://www.digi.com/products/wireless-wired-embedded-solutions/zigbee-rf-modules/xctu XTCU software]. With XTCU be sure to program also the same parameters on both modules in order to be able to communicate between each other: <code>Channel</code> (e.g. <code>C</code>), <code>PAN ID</code> (e.g. <code>3332</code>), <code>DH = 0</code>, <code>DL = 0</code>, <code>MY = 0</code>.
==Button==
There is a small button in the back side of the Elisa-3. The variable <font style="color:red"><code>button</code></font> ''[R]'' holds the state of this button (1 = released, 0 = pressed).<br/>
Elisa-3 generates the <font style="color:green"><code>button</code></fonT> event when it is pressed or released.


Some Python examples ara available in the [https://github.com/yorkrobotlab/pi-puck-expansion-board YRL Expansion Board GitHub repository] that can be used to communicate with the XBee module mounted on the YRL expansion. These examples are based on the [https://github.com/digidotcom/xbee-python Digi XBee Python library] that can be installed with the command <code>pip3 install digi-xbee</code>. This library requires the XBee module to be configured in API mode; you can setup this mode following these steps:
==Proximity sensors==
# <code> git clone https://github.com/yorkrobotlab/pi-puck-expansion-board.git</code>
Elisa-3 has 8 proximity sensors around its periphery (placed at 45 degrees from each other). Two arrays of 8 variables hold the values of these sensors, the first is <font style="color:red"><code>prox</code></font> ''[R]'' and represents the proximity to an object, the second is <font style="color:red"><code>prox.ambient</code></font> ''[R]'' and represents the ambient light intensity:
# <code>cd pi-puck-expansion-board/xbee</code>
* <font style="color:red"><code>prox[0]</code></font>, <font style="color:red"><code>prox.ambient[0]</code></font> : front
# <code>python3 xbee-enable-api-mode.py</code>
* <font style="color:red"><code>prox[1]</code></font>, <font style="color:red"><code>prox.ambient[1]</code></font> : front right
* <font style="color:red"><code>prox[2]</code></font>, <font style="color:red"><code>prox.ambient[2]</code></font> : right
* <font style="color:red"><code>prox[3]</code></font>, <font style="color:red"><code>prox.ambient[3]</code></font> : back right
* <font style="color:red"><code>prox[4]</code></font>, <font style="color:red"><code>prox.ambient[4]</code></font> : back
* <font style="color:red"><code>prox[5]</code></font>, <font style="color:red"><code>prox.ambient[5]</code></font> : back left
* <font style="color:red"><code>prox[6]</code></font>, <font style="color:red"><code>prox.ambient[6]</code></font> : left
* <font style="color:red"><code>prox[7]</code></font>, <font style="color:red"><code>prox.ambient[7]</code></font> : front left


Now connect the second module to the computer and run XTCU, select the console view and open the serial connection. Then run the [https://projects.gctronic.com/epuck2/PiPuck/xbee-send-broadcast.py xbee-send-broadcast.py] example from the Pi-puck by issuing the command: <code>python3 xbee-send-broadcast.py</code>. From the XTCU console you should receive <code>Hello Xbee World!</code>.
The values in the <font style="color:red"><code>prox</code></font> array vary from 0 (the robot does not see anything) to 255 (the robot is very close to an obstacle); the values of the <font style="color:red"><code>prox.ambient</code></font> array start from 1023 when completely dark and decrease with light increase. Elisa-3 updates these arrays at a frequency of about 80 Hz (when local communication is disabled), and generates the <font style="color:green"><code>ir.sensors</code></font> event after every update.


For more information refer to [https://pi-puck.readthedocs.io/en/latest/extensions/yrl-expansion/xbee/ https://pi-puck.readthedocs.io/en/latest/extensions/yrl-expansion/xbee/].


=Time-of-Flight Distance Sensor add-on=
The Pi-puck extension integrates six sensor board sockets that can be used to add up to six VL53L1X-based distance sensor add-ons. The Pi-puck equipped with these add-ons is shown in the following figure:<br/>
<span class="plainlinks">[https://projects.gctronic.com/epuck2/wiki_images/pi-puck-tof.jpg <img width=250 src="https://projects.gctronic.com/epuck2/wiki_images/pi-puck-tof.jpg">]</span><br/>
For more information have a look at [https://pi-puck.readthedocs.io/en/latest/extensions/tof-sensor/#time-of-flight-distance-sensor https://pi-puck.readthedocs.io/en/latest/extensions/tof-sensor/#time-of-flight-distance-sensor].


<font style="color:red"> Beware that once the socket for the ToF add-on sensor '''3''' is soldered on the pi-puck extension, you are no more able to connect the HDMI cable.</font>


==Communicate with the ToF sensors==
In order to communicate with the sensors you can use the <code>multiple-i2c-bus-support</code> branch of the [https://github.com/pimoroni/vl53l1x-python vl53l1x-python] library from [https://shop.pimoroni.com/ Pimoroni]. To install this library follow these steps:
# <code>git clone -b multiple-i2c-bus-support https://github.com/pimoroni/vl53l1x-python.git</code>
# <code>cd vl53l1x-python</code>
# <code>sudo python3 setup.py install</code>
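
Once the library is installed, reading a distance takes only a few lines. The following is an illustrative sketch (the <code>i2c_bus</code> number used here is an assumption, it depends on the socket the sensor is plugged into; <code>0x29</code> is the default VL53L1X address):
<pre>
# Illustrative VL53L1X reading sketch; adapt i2c_bus to the socket used.
import time
import VL53L1X

tof = VL53L1X.VL53L1X(i2c_bus=4, i2c_address=0x29)
tof.open()
tof.start_ranging(1)  # 1 = short range, 2 = medium, 3 = long
for _ in range(10):
    print("Distance: {} mm".format(tof.get_distance()))
    time.sleep(0.1)
tof.stop_ranging()
</pre>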


A Python example showing how to read data from the ToF sensors is available in the Pi-puck repository of the York Robotics Lab:
# <code>git clone https://github.com/yorkrobotlab/pi-puck.git pi-puck_yrl</code>
# <code>cd pi-puck_yrl/python-library</code>
# <code>python3 pipuck-library-test.py -t</code>


=Ultra Wide Band extension for Pi-Puck=
<span class="plainlinks">[https://projects.gctronic.com/epuck2/wiki_images/uwb-top.jpg <img width=200 src="https://projects.gctronic.com/epuck2/wiki_images/uwb-top.jpg">]</span>
<span class="plainlinks">[https://projects.gctronic.com/epuck2/wiki_images/uwb-front.jpg <img width=200 src="https://projects.gctronic.com/epuck2/wiki_images/uwb-front.jpg">]</span><br/>


The Ultra Wide Band extension is connected to the Pi-puck extension through the extension connector pins (2x10). The SPI channel is used for the communication between the Raspberry Pi and the Ultra Wide Band module, so that the user can configure the module and receive position information from the RPi OS.<br/>
The Android application [https://projects.gctronic.com/epuck2/uwb/DRTLS_Manager_R2.apk DRTLS_Manager_R2.apk] can also be used to configure the Ultra Wide Band extension.<br/>


To get started you need 4 <code>ANCHORS</code> to delimit your arena (see the following figure); they are standalone, and the Ultra Wide Band extension is equipped with a standard battery connector for easy recharging.
<span class="plainlinks">[https://projects.gctronic.com/epuck2/wiki_images/uwb-anchor-front.jpeg <img width=200 src="https://projects.gctronic.com/epuck2/wiki_images/uwb-anchor-front.jpeg">]</span><br/>
Then you can equip your robots with the Ultra Wide Band extension and, by configuring them as <code>TAG</code>, you will receive their position inside the arena. <br/>


During our tests on a 2 m x 2 m arena with 4 anchors we obtained a position (x, y) accuracy for the e-puck2 robot of about 7 cm. Have a look at the following video:
{{#ev:youtube|RpJ8NqjytHM}}


==Documentation==
# [https://projects.gctronic.com/epuck2/uwb/DWM1001C_Datasheet.pdf DWM1001C_Datasheet.pdf]
# [https://projects.gctronic.com/epuck2/uwb/DWM1001-Firmware-User-Guide.pdf DWM1001-Firmware-User-Guide.pdf]
# [https://projects.gctronic.com/epuck2/uwb/DWM1001-API-Guide.pdf DWM1001-API-Guide.pdf]


Introduction

Aseba is a set of tools which allow novices to program robots easily and efficiently, refer to https://www.thymio.org/en:start for more information.

Prerequisites

The following steps need to be done only once:

  1. The communication between Aseba and Elisa-3 is done through the USB cable (serial communication is used), so you need to install the driver; refer to section Elisa-3 requirements. In the future we may also add RF support
  2. Download and install Aseba version 1.5.5
  3. Download the Elisa-3 target for Aseba elisa3-aseba.hex and upload it to the robot (refer to section Elisa-3 Programming)

Connection with AsebaStudio

The following steps explain how to start playing with the Aseba Studio:
1. Connect the robot to the computer if not already done and turn it on
2. Download the following script based on your platform and modify its content specifying the AsebaStudio installation folder and the robot port:

  • Windows: asebaswitch_elisa3.bat; specify the installation folder (e.g. C:\Program Files (x86)\AsebaStudio) and the port number (e.g. 10 for COM10)
  • Linux / Mac OS: asebaswitch_elisa3.sh; specify the installation folder (e.g. /usr/bin in Linux or /Applications/Aseba/bin in Mac OS) and the port (e.g. /dev/ttyUSB0 in Linux or /dev/cu.usbserial-XXXXX in Mac OS)

3. Start the script:

  • Windows: double click on the bat file
  • Linux / Mac OS: set the script to be executable with the command chmod +x asebaswitch_elisa3.sh and then execute it with ./asebaswitch_elisa3.sh

4. Start AsebaStudio and select Network (TCP), insert localhost as Host and specify 33333 as Port to open the connection with the robot
5. If the connection is correctly established, you should see the Elisa-3 variables on the left side of AsebaStudio

Have a look also at the following video (you must use the script instead of manually issuing the command as in the video):

Simple test

Once the connection is opened, click on the auto checkbox to start updating the sensor data automatically. You can now interact with the robot: on the left side you can see all the sensor values (proximity, ground, accelerometer, ...), and you can change the motor speed and turn on/off all the LEDs.

Software

First of all have a look at some of the examples proposed in the following section AsebaStudio examples.
Then, when you're ready, you can start programming the robot on your own; refer to section Programming interface. Moreover you can have a look at https://www.thymio.org/en:start for more information.
Pay attention that you have only 100 bytes available for your script due to memory constraints.
If you want to have a look behind the scenes, refer to section Contribute to the Elisa-3 Aseba target.

AsebaStudio examples

You can download all the following examples from aseba-elisa3-examples.zip; in order to launch an example follow these steps:

  1. place the robot selector in position 5. When the robot is turned on, the position of the selector (from 0 to 9) defines the node name in AsebaStudio; in our case the node name will be elisa3-5 (where 5 is the selector position)
  2. extract the zip; the directory contains some files with the aesl extension, which is the AsebaStudio source code format
  3. connect the robot with AsebaStudio as explained previously
  4. click on menu File => Open... and select one of the examples extracted from the zip
  5. click on the Load button and then on Run; now the code is running on the robot but it isn't stored in EEPROM, thus when you turn off the robot it returns to its initial state

If you want to save the program in memory you need to click on Tools => Write the program(s)... => inside the elisa3 and wait for the programming to finish (the green LEDs around the robot will be turned on while the memory is written); pay attention to uncheck the auto update of the robot variables in AsebaStudio before starting the writing (with the auto update enabled the writing could block).

Simple obstacle avoidance

var i = 1
while (i==1) do
	if ((prox[0] > 50) or (prox[1] > 50) or (prox[7] > 50)) then
		# obstacle ahead: rotate in place
		mot.left.target = -20
		mot.right.target = 20
	else
		# path clear: go straight
		mot.left.target = 20
		mot.right.target = 20
	end
end

To fully see the results of this example you need to write the code into the robot, then let it move with some objects around.

RGB control

onevent ir.sensors
	if (prox[0] > 20) then	# avoid noise
		led.rgb[0] = prox[0]
	else
		led.rgb[0] = 0	
	end
	
	if (prox[1] > 20) then	# avoid noise
		led.rgb[1] = prox[1]
	else
		led.rgb[1] = 0	
	end
	
	if (prox[7] > 20) then	# avoid noise
		led.rgb[2] = prox[7]
	else
		led.rgb[2] = 0	
	end	

Once the code is loaded on the robot you can "control" the intensity of the red, green and blue channels with prox[0], prox[1] and prox[7] respectively. Try to get the primary and secondary colors (https://en.wikipedia.org/wiki/Primary_color)... hint: you need two fingers.

Working with events

var color = 0

onevent ir.sensors
	led.green[0] = 1 - led.green[0]
	led.green[2] = 1 - led.green[2]
	led.green[4] = 1 - led.green[4]
	led.green[6] = 1 - led.green[6]

onevent acc
	led.green[1] = 1 - led.green[1]
	led.green[3] = 1 - led.green[3]
	led.green[5] = 1 - led.green[5]
	led.green[7] = 1 - led.green[7]

onevent timer
	led.rgb[color] = 255 - led.rgb[color]
	
onevent button
	if (color == 2) then
		color = 0
	else
		color++	
	end
	led.rgb[0] = 0
	led.rgb[1] = 0
	led.rgb[2] = 0

The green LEDs show the update frequency of the proximity and accelerometer sensors (you can measure it with an oscilloscope if you have one). You can change the value of the variable timer.period to change the blinking frequency of the RGB LED; the resolution is 1 ms (e.g. by putting 1000 you'll get the RGB LED blinking at 1 Hz). Moreover you can try pressing the button and see what happens (you can probably deduce it from the code...).

Remote control

onevent rc5
	if rc5 == 2 then	# forward
		mot.left.target = 20
		mot.right.target = 20
	elseif rc5 == 5 then	# stop
		mot.left.target = 0
		mot.right.target = 0
	elseif rc5 == 8 then	# backward
		mot.left.target = -20
		mot.right.target = -20
	elseif rc5 == 4 then	# left
		mot.left.target = 0
		mot.right.target = 20
	elseif rc5 == 6 then	# right
		mot.left.target = 20
		mot.right.target = 0
	else	# error
		led.rgb[0] = 255
	end

To fully see the results of this example you need to write the code into the robot. You may need to adapt the values used for the various motions, and you can certainly extend the functionality using other RC5 codes (e.g. to change the RGB LED color).

Simple local communication

In this example we need to connect two robots at the same time to AsebaStudio; to accomplish this, asebaswitch needs to be called with a different command, that is:
asebaswitch -d -v "ser:port=104;baud=57600;stop=1;parity=none;fc=none;bits=8" "ser:port=69;baud=57600;stop=1;parity=none;fc=none;bits=8"
Basically there are two targets instead of one; you need to specify the correct port number for both robots. Moreover you need to place the receiver robot's selector in position 5 and the transmitter robot's selector in another position (from 0 to 9). Both robots will blink if the connection is correctly opened.
Load the following code to the receiver robot:

call prox.comm.enable(1)
onevent prox.comm
	led.green[0] = 0
	led.green[1] = 0
	led.green[2] = 0
	led.green[3] = 0
	led.green[4] = 0
	led.green[5] = 0
	led.green[6] = 0
	led.green[7] = 0
	led.green[prox.comm.rx.id] = 1
	if (prox.comm.rx == 1) then
		led.rgb[0] = 255
		led.rgb[1] = 0
		led.rgb[2] = 0
	elseif (prox.comm.rx == 2) then
		led.rgb[0] = 0
		led.rgb[1] = 255
		led.rgb[2] = 0
	elseif (prox.comm.rx == 3) then
		led.rgb[0] = 0
		led.rgb[1] = 0
		led.rgb[2] = 255
	end

Load the following single line of code to the transmitter robot:

call prox.comm.enable(1)

Now you can change the prox.comm.tx value (from 1 to 3) on the transmitter tab and see the effect on the receiver robot; prox.comm.rx and prox.comm.rx.id on the receiver tab will also change accordingly. You can easily transform the id into an angle by knowing that the sensors are placed at 45 degrees from each other. Remember to place the robots near each other (< 5 cm).
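
For instance, the following standalone snippet (an illustrative addition, not part of the examples zip) converts the receiving sensor id into an angle in degrees, clockwise from the front:

var angle

onevent prox.comm
	angle = prox.comm.rx.id * 45	# sensor 0 = 0 degrees (front), ids increase clockwise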

Square path

In this example we exploit the onboard odometry to let the robot move along a square path. You have two possibilities: either running on a vertical wall or running on a horizontal plane; both cases are handled automatically. When the Elisa-3 is turned on, it calibrates the sensors (when it is placed vertically it rotates around itself for a while with the green LED turned on); when the calibration process is finished you can start the square by "touching" the back proximity sensor. Pay attention that the robot must be placed with the front pointing right when placed vertically.

var DISTANCE = 100	# given in mm
var start = 0
var state = 0
var isVertical
sub updateState
	if start == 1 then
		if (state == 0) then
			mot.left.target = 20
			mot.right.target = 20
			led.rgb[0] = 255
			led.rgb[1] = 0
			led.rgb[2] = 0
			if (odom.x >= DISTANCE) then
				state = 1
			end		
		elseif (state == 1) then
			mot.left.target = 0
			mot.right.target = 15
			led.rgb[0] = 0
			led.rgb[1] = 255
			led.rgb[2] = 0
			if (odom.theta >= 90) then
				state = 2
			end		
		elseif (state == 2) then
			mot.left.target = 20
			mot.right.target = 20
			led.rgb[0] = 0
			led.rgb[1] = 0
			led.rgb[2] = 255
			if (odom.y >= DISTANCE) then
				state = 3
			end	
		elseif (state == 3) then
			mot.left.target = 0
			mot.right.target = 20
			led.rgb[0] = 255
			led.rgb[1] = 255
			led.rgb[2] = 0
			call robot.isVertical(isVertical)
			if (isVertical == 1) then
				if (odom.theta < 0) then
					state = 4
				end					
			else
				if (odom.theta >= 180) then
					state = 4
				end					
			end				
		elseif (state == 4) then
			mot.left.target = 20
			mot.right.target = 20
			led.rgb[0] = 255
			led.rgb[1] = 0
			led.rgb[2] = 255
			if (odom.x <= 0) then
				state = 5
			end		
		elseif (state == 5) then
			mot.left.target = 0
			mot.right.target = 20
			led.rgb[0] = 0
			led.rgb[1] = 255
			led.rgb[2] = 255
			call robot.isVertical(isVertical)
			if (isVertical == 1) then
				if ((odom.theta >= -90) and (odom.theta < 0) ) then
					state = 6
				end				
			else
				if (odom.theta >= 270) then
					state = 6
				end				
			end				
		elseif (state == 6) then
			mot.left.target = 20
			mot.right.target = 20
			led.rgb[0] = 0
			led.rgb[1] = 0
			led.rgb[2] = 0
			if (odom.y <= 0) then
				state = 7
			end
		elseif (state == 7) then
			mot.left.target = 0
			mot.right.target = 20
			led.rgb[0] = 0
			led.rgb[1] = 0
			led.rgb[2] = 0
			call robot.isVertical(isVertical)
			if (isVertical == 1) then
				if (odom.theta >= 0) then
					state = 8
				end				
			else
				if (odom.theta >= 360) then
					state = 8
				end				
			end			
		elseif (state == 8) then
			mot.left.target = 0
			mot.right.target = 0
			start = 0										
		end			
	end

onevent ir.sensors
	if (start == 0) then
		if (prox[4] > 200) then
			call reset.odometry()
			state = 0
			start = 1	
		end		
	end
	callsub updateState

Contribute to the Elisa-3 Aseba target

You can get the source code of the Elisa-3 Aseba target from github.
The repo contains all Arduino targets; the Elisa-3 target is placed in the directory elisa3. To build the project follow these steps:

  1. clone the repo by issuing the command: git clone --recursive https://github.com/gctronic/aseba-targets-arduino.git
  2. download Atmel Studio 7, since this IDE was used to create the project; the installation of Atmel Studio also includes the toolchain, so you should be able to build the project without any modification
  3. to open the project double click elisa3-aseba.atsln

Dependencies

The project depends on some files of Aseba, which is included as a submodule in the aseba-targets-arduino repo to simplify building. The files referenced from the project are:

  • aseba\trunk\vm\natives.c
  • aseba\trunk\vm\natives.h
  • aseba\trunk\vm\vm.c
  • aseba\trunk\vm\vm.h
  • aseba\trunk\transport\buffer\vm-buffer.c
  • aseba\trunk\transport\buffer\vm-buffer.h

The project also depends on the Elisa-3 library contained in the Elisa-3 advanced firmware revision 221.

Programming interface

This page describes the programming capabilities of Elisa-3. It lists the different variables, events and functions and indicates to which elements of the robot they refer (see section Hardware for the actual position of the sensors and actuators on the robot). Each variable is marked with either [R] or [W] to indicate whether it is used to read a value from the robot or write a value to the robot, respectively. This page refers to firmware revision 0 and later.
A document summarizing the programming interface is available at the following link: elisa3-aseba-cheatsheet.png.

Standard library

The Elisa-3 comes with the Aseba standard library of native functions, documented on its own page.

Motors

You can change the wheel speeds by writing in these variables:

  • mot.left.target [W]: requested speed for the left wheel
  • mot.right.target [W]: requested speed for the right wheel

You can read the real wheel speeds from these variables:

  • mot.left.speed [R]: real speed of the left wheel
  • mot.right.speed [R]: real speed of the right wheel

The values range from -127 to 127, one unit = 5 mm/s. A value of 127 approximately corresponds to a linear speed of 60 cm/s.
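
For instance, the following minimal snippet drives the robot straight at roughly 10 cm/s (the speed value is chosen arbitrarily):

# 20 units x 5 mm/s = 100 mm/s on each wheel
mot.left.target = 20
mot.right.target = 20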

Green LEDs

8 green LEDs make up a circle on the bottom of the robot.
led.green[0..7] [W]: index 0 sets the intensity of the LED at the front of the robot, the others are numbered clockwise.

RGB LED

There is one RGB LED in the center of the robot, its light is smoothly spread out through the top diffuser.
led.rgb[0..2] [W]: the indexes 0, 1 and 2 set respectively the intensity of the red, green and blue.
The values range from 0 (off) to 255 (max intensity).

IR transmitters

There are 3 IR transmitters pointing upwards, two placed on the front side of the robot and one on the back side. You can control their state by writing these variables:

  • ir.tx.front [W]: 0 means that both front IRs are turned off, 1 means that both front IRs are turned on
  • ir.tx.back [W]: 0 means that the back IR is turned off, 1 means that the back IR is turned on

Button

There is a small button on the back side of the Elisa-3. The variable button [R] holds the state of this button (1 = released, 0 = pressed).
Elisa-3 generates the button event when it is pressed or released.

Proximity sensors

Elisa-3 has 8 proximity sensors around its periphery (placed at 45 degrees from each other). Two arrays of 8 variables hold the values of these sensors, the first is prox [R] and represents the proximity to an object, the second is prox.ambient [R] and represents the ambient light intensity:

  • prox[0], prox.ambient[0] : front
  • prox[1], prox.ambient[1] : front right
  • prox[2], prox.ambient[2] : right
  • prox[3], prox.ambient[3] : back right
  • prox[4], prox.ambient[4] : back
  • prox[5], prox.ambient[5] : back left
  • prox[6], prox.ambient[6] : left
  • prox[7], prox.ambient[7] : front left

The values in the prox array vary from 0 (the robot does not see anything) to 255 (the robot is very close to an obstacle); the values of the prox.ambient array start from 1023 when completely dark and decrease with light increase. Elisa-3 updates these arrays at a frequency of about 80 Hz (when local communication is disabled), and generates the ir.sensors event after every update.

Ground sensors

Elisa-3 has 4 ground proximity sensors. These sensors are located at the front of the robot. As black ground appears like no ground at all (black absorbs the infrared light), these sensors can be used to follow a line on the ground and also to avoid falling off the table. Two arrays of 4 variables hold the values of these sensors: the first is ground [R] and represents the proximity to the ground or the presence of a black line, the second is ground.ambient [R] and represents the ambient light intensity at the ground:

  • ground[0], ground.ambient[0] : left
  • ground[1], ground.ambient[1] : front left
  • ground[2], ground.ambient[2] : front right
  • ground[3], ground.ambient[3] : right

The values in the ground array normally vary from about 600 (white surface) to about 300 (black surface or no ground); the values of the ground.ambient array start from 1023 when completely dark and decrease with light increase. Elisa-3 updates these arrays at a frequency of about 80 Hz (when local communication is disabled), and generates the same ir.sensors event after every update.
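
As an illustrative sketch (not one of the original examples), these values can be used for a simple line follower; the threshold of 450 is an assumption, roughly halfway between the typical white (~600) and black (~300) readings, and the speed values are arbitrary:

onevent ir.sensors
	if ((ground[1] < 450) and (ground[2] < 450)) then
		# both front sensors over the black line: go straight
		mot.left.target = 15
		mot.right.target = 15
	elseif (ground[1] < 450) then
		# line under the front-left sensor only: steer left
		mot.left.target = 5
		mot.right.target = 15
	elseif (ground[2] < 450) then
		# line under the front-right sensor only: steer right
		mot.left.target = 15
		mot.right.target = 5
	end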

Accelerometer

Elisa-3 contains a 3-axis accelerometer. An array of 3 variables, acc [R], holds the values of the acceleration along these 3 axes:

  • acc[0] : x-axis (from back to front, positive forward)
  • acc[1] : y-axis (from left to right, positive towards right)
  • acc[2] : z-axis (from bottom to top, positive upward)

The values in this array vary from -128 to 128, with 1 g (the acceleration of the earth's gravity) corresponding to the value 64. Elisa-3 generates the acc event after every update.
The z-axis is also used to know the current orientation of the robot, that is whether it is moving vertically or horizontally; the current orientation can be accessed using the function robot.isVertical(dest), where dest will be 1 if it is vertical or 0 if it is horizontal.
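
For example, the following minimal sketch uses this function to signal the current orientation with the RGB LED:

var vertical

onevent acc
	call robot.isVertical(vertical)
	if (vertical == 1) then
		led.rgb[0] = 255	# red while on a vertical surface
	else
		led.rgb[0] = 0
	end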

Selector

The variable selector [R] shows the current position of the selector (from 0 to 15). Elisa-3 generates the sel event every time its position is changed.
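
For example, the following minimal snippet maps the selector position to the intensity of the red channel:

onevent sel
	# selector ranges from 0 to 15, so the red intensity ranges from 0 to 225
	led.rgb[0] = selector * 15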

Remote control

Elisa-3 contains a receiver for infrared remote controls compatible with the RC5 protocol. When Elisa-3 receives an RC5 code, it generates the rc5 event and the variable rc5 [R] is updated.

Battery

The variable bat.percent [R] gives you an estimate of the current battery charge as a percentage (100% means you still have a lot of playful time left, 0% means you need to wait a little and put the robot in charge). The sampled value can be accessed with the variable _bat.adc [R] (this is a hidden variable); the values range from 0 to 1023.
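
As a minimal sketch (the 20% threshold is arbitrary), the battery charge can be checked at every sensor update to show a low-battery warning:

onevent ir.sensors
	if (bat.percent < 20) then
		led.rgb[0] = 255	# low battery: red warning
	else
		led.rgb[0] = 0
	end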

Local communication

Elisa-3 can use its infrared proximity sensors to communicate with other robots within a range of about 5 cm. For more detailed information refer to section Local communication.
To use the communication, call the prox.comm.enable(state) function, with 1 in state to enable communication or 0 to turn it off. If the communication is enabled, the value in the prox.comm.tx [W] variable is transmitted to other robots from all the sensors. When Elisa-3 receives a value, the prox.comm event is fired and the value is in the prox.comm.rx [R] variable; moreover, the prox.comm.rx.id [R] variable contains the id of the sensor that received the data (from 0 to 7, where 0 is the front sensor and ids increase clockwise).

Odometry

Elisa-3 is capable of estimating how much distance each wheel has traveled, resulting in a robot position given in cartesian coordinates (x, y). When moving horizontally the orientation is estimated through the distance traveled by each wheel; when moving vertically (yes, Elisa-3 can move vertically thanks to its magnetic wheels) the orientation is given directly by the accelerometer and is very precise.
The variable odom.theta [R] contains the current orientation of the robot given in degrees: when moving horizontally the orientation continuously decreases when moving clockwise and continuously increases when moving counter-clockwise; when moving vertically the orientation ranges from -180 to 180 degrees. The variables odom.x [R] and odom.y [R] contain the current position of the robot given in millimeters.
By calling the function reset.odometry all the odometry data are reset to zero.

Timer

Elisa-3 provides a user-defined timer. The variable timer.period [W] allows you to specify the period of the timer in milliseconds. The timer starts the countdown when it is initialized (value > 0). When the period expires, the timer generates a timer event. This event is managed in the same way as all the others and cannot interrupt an already executing event handler. The maximum value is 32767 ms (about 32 seconds).
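
For example, the following minimal snippet toggles the front green LED once per second:

timer.period = 1000	# generate a timer event every 1000 ms

onevent timer
	led.green[0] = 1 - led.green[0]	# toggle the front green LED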

Onboard behaviors

Elisa-3 includes two onboard behaviors that can be activated or deactivated at will: obstacle avoidance and cliff detection. To use obstacle avoidance, call the behavior.oa.enable(state) function, with 1 in state to enable obstacle avoidance or 0 to disable it; when activated, the motor speed will be adapted in order to avoid obstacles. To use cliff detection, call the behavior.cliff.enable(state) function, with 1 in state to enable cliff detection or 0 to disable it; when activated, the Elisa-3 will stop as soon as it detects a cliff (pay attention that the robot can detect the void only when going forward).
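
For instance, a minimal script that enables both behaviors and then simply drives forward (the speed value is chosen arbitrarily):

call behavior.oa.enable(1)	# adapt the motor speed to avoid obstacles
call behavior.cliff.enable(1)	# stop as soon as a cliff is detected
mot.left.target = 20
mot.right.target = 20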