Hardware
Hardware used
This is the hardware I’m using to create the cluster:
Raspberry Pi nodes
- 4 x Raspberry Pi 4 - Model B (4 GB) and 1 x Raspberry Pi 4 - Model B (8 GB) for the Kubernetes cluster (1 master node and 4 worker nodes).
- 1 x Raspberry Pi 4 - Model B (2 GB) used as a router for the lab environment, connected via WiFi to my home network and securing access to my lab network.
- 4 x SanDisk Ultra 32 GB microSDHC memory cards (Class 10) for installing Raspberry Pi OS and enabling boot from USB (updating the Raspberry Pi firmware and modifying the boot partition; see the USB-boot sketch after this list).
- 4 x Samsung USB 3.1 32 GB Fit Plus Flash Disk
- 1 x Kingston A400 SSD Disk 480 GB
- 4 x Kingston A400 SSD Disk 240 GB
- 5 x StarTech USB 3.0 to SATA III adapters for connecting the SSD disks to USB 3.0 ports.
- 1 x GeeekPi Pi Rack Case (it comes with a stack for 4 Raspberry Pis, plus heatsinks and fans).
- 1 x SSD Rack Case
- 1 x ANIDEES AI CHARGER 6+, 6-port USB power supply (60 W, max 12 A)
- 1 x ANKER USB Charging Hub, 6-port USB power supply (60 W, max 12 A)
- 6 x USB-C charging cables with ON/OFF switch.
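As a sketch of the USB-boot preparation mentioned above (assuming Raspberry Pi OS and the stock `rpi-eeprom` tooling; exact steps vary with firmware version):

```shell
# Check the current bootloader state and apply any pending EEPROM update
sudo rpi-eeprom-update
sudo rpi-eeprom-update -a
# After a reboot, inspect the bootloader config; a BOOT_ORDER of 0xf41
# means "try SD card first, then USB, then repeat"
vcgencmd bootloader_config
```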
x86 nodes
- 2 x HP EliteDesk 800 G3 (i5-6500T 2.5 GHz, 8 GB RAM, 256 GB SSD) for additional cluster nodes. One of the nodes, `node-hp-2`, has a 256 GB M.2 NVMe SSD; the other, `node-hp-1`, has a 240 GB Kingston SATA SSD.
- 2 x Crucial 8 GB DDR4 2400 MHz CL17 RAM modules as memory expansion for the mini PCs. Total memory: 16 GB per node.
Networking
- 1 x Netgear GS108-300PES, 8-port Gigabit Ethernet managed switch (QoS and VLAN support)
- 8 x Ethernet cables (flat Cat 6, 15 cm length)
Raspberry Pi Storage benchmarking
Different Raspberry Pi storage configurations have been tested:
- Internal SDCard: SanDisk Ultra 32 GB microSDHC Memory Card (Class 10)
- Flash Disk USB 3.0: Samsung USB 3.1 32 GB Fit Plus Flash Disk
- SSD Disk: Kingston A400 480 GB + StarTech USB 3.0 to SATA III adapter
- iSCSI volumes: using another Raspberry Pi as storage server, configured as iSCSI target, with an SSD disk attached.
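For context, mounting such a volume from a cluster node with `open-iscsi` looks roughly like the sketch below (the portal IP and IQN are made-up placeholders, not the lab's actual values):

```shell
# Discover the targets exposed by the storage Pi (portal IP is a placeholder)
sudo iscsiadm -m discovery -t sendtargets -p 192.168.0.11
# Log in to the discovered target (IQN is a placeholder)
sudo iscsiadm -m node -T iqn.2021-07.example:storage.ssd -p 192.168.0.11 --login
# The LUN then shows up as a regular block device (e.g. /dev/sdb) and can be
# partitioned, formatted and benchmarked like any local disk
lsblk
```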
Testing procedure
Sequential and random I/O tests have been executed with the different storage configurations.
For the testing, a tweaked version of the script provided by James A. Chambers (https://jamesachambers.com/) has been used.
Test execution has been automated with Ansible. See the pi-storage-benchmark repository for the details of the testing procedure and the results.
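As a rough illustration of what such an automated run can look like (the inventory and playbook names below are hypothetical placeholders, not the repository's actual files):

```shell
# Run the benchmark playbook against every node in the inventory
# (file names and variables are hypothetical, for illustration only)
ansible-playbook -i inventory.yml run-storage-benchmark.yml \
  -e benchmark_device=/dev/sda1 -e results_dir=./results
```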
Sequential I/O performance
Test sequential I/O with the `dd` and `hdparm` tools. `hdparm` can be installed through `sudo apt install -y hdparm`.
- Read speed (use `hdparm` command):

  ```shell
  sudo hdparm -t /dev/sda1
  Timing buffered disk reads: 72 MB in 3.05 seconds = 23.59 MB/sec

  sudo hdparm -T /dev/sda1
  Timing cached reads: 464 MB in 2.01 seconds = 231.31 MB/sec
  ```
  It can be combined in just one command:

  ```shell
  sudo hdparm -tT --direct /dev/sda1
  Timing O_DIRECT cached reads: 724 MB in 2.00 seconds = 361.84 MB/sec
  Timing O_DIRECT disk reads: 406 MB in 3.01 seconds = 134.99 MB/sec
  ```
- Write speed (use `dd` command):

  ```shell
  sudo dd if=/dev/zero of=test bs=4k count=80k conv=fsync
  81920+0 records in
  81920+0 records out
  335544320 bytes (336 MB, 320 MiB) copied, 1,86384 s, 180 MB/s
  ```
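One caveat when repeating these tests: Linux caches disk blocks aggressively, so back-to-back runs can report inflated numbers. Dropping the page cache between runs (a common practice, not necessarily part of the script used here) keeps the results honest:

```shell
# Flush dirty buffers to disk, then drop page cache, dentries and inodes
sync
echo 3 | sudo tee /proc/sys/vm/drop_caches
```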
Random I/O Performance
Tools used: `fio` and `iozone`.
- Install required packages with:

  ```shell
  sudo apt install iozone3 fio
  ```
- Check random I/O with `fio`:

  Random write:

  ```shell
  sudo fio --minimal --randrepeat=1 --ioengine=libaio --direct=1 --gtod_reduce=1 --name=test --filename=test --bs=4k --iodepth=64 --size=80M --readwrite=randwrite
  ```

  Random read:

  ```shell
  sudo fio --minimal --randrepeat=1 --ioengine=libaio --direct=1 --gtod_reduce=1 --name=test --filename=test --bs=4k --iodepth=64 --size=80M --readwrite=randread
  ```
- Check random I/O with `iozone`:

  ```shell
  sudo iozone -a -e -I -i 0 -i 1 -i 2 -s 80M -r 4k
  ```
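Since `--minimal` makes `fio` emit a single semicolon-separated line (its terse output format), the interesting numbers can be pulled out with `awk`. A minimal sketch, assuming fio's terse format v3, in which write IOPS is field 49 (the field layout is an assumption here; check the "minimal output" section of `man fio` for your version):

```shell
# Run the random-write test and capture fio's terse (semicolon-separated) output
out=$(sudo fio --minimal --randrepeat=1 --ioengine=libaio --direct=1 --gtod_reduce=1 \
  --name=test --filename=test --bs=4k --iodepth=64 --size=80M --readwrite=randwrite)
# Field 49 holds write IOPS in terse v3 (an assumption -- verify with `man fio`)
echo "$out" | awk -F';' '{print "write IOPS:", $49}'
```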
Performance Results
Average metrics obtained during the tests, after removing the worst and the best results, can be found in the following table and graphs:
| | Disk Read (MB/s) | Cache Disk Read (MB/s) | Disk Write (MB/s) | FIO 4K Random Read (IOPS) | FIO 4K Random Read (KB/s) | FIO 4K Random Write (IOPS) | FIO 4K Random Write (KB/s) | IOZONE 4k Read (KB/s) | IOZONE 4k Write (KB/s) | IOZONE 4k Random Read (KB/s) | IOZONE 4k Random Write (KB/s) | Global Score |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| SDCard | 41.89 | 39.02 | 19.23 | 2767.33 | 11071.00 | 974.33 | 3899.33 | 8846.33 | 2230.33 | 7368.67 | 3442.33 | 1169.67 |
| FlashDisk | 55.39 | 50.51 | 21.30 | 3168.40 | 12675.00 | 2700.20 | 10802.40 | 14842.20 | 11561.80 | 11429.60 | 10780.60 | 2413.60 |
| SSD | 335.10 | 304.67 | 125.67 | 22025.67 | 88103.33 | 18731.33 | 74927.00 | 31834.33 | 26213.33 | 17064.33 | 29884.00 | 8295.67 |
| iSCSI | 70.99 | 71.46 | 54.07 | 5104.00 | 20417.00 | 5349.67 | 21400.00 | 7954.33 | 7421.33 | 6177.00 | 7788.33 | 2473.00 |
Graphs:
- Sequential I/O
- Random I/O (FIO)
- Random I/O (IOZONE)
- Global Score
Conclusions:
- Clearly `SSD` with the USB 3.0 to SATA adapter beats the rest in all performance tests.
- `SDCard` obtains worse metrics than `FlashDisk` and `iSCSI`.
- `FlashDisk` and `iSCSI` get similar performance metrics: the performance obtained using a locally attached USB 3.0 flash disk is quite similar to the one obtained using iSCSI with a Raspberry Pi + SSD disk as central storage.