diff --git a/content/hardware/05.nicla/boards/nicla-vision/compatibility.yml b/content/hardware/05.nicla/boards/nicla-vision/compatibility.yml new file mode 100644 index 0000000000..3af9b19ae2 --- /dev/null +++ b/content/hardware/05.nicla/boards/nicla-vision/compatibility.yml @@ -0,0 +1,6 @@ +software: + - arduino-ide + - arduino-cli + - iot-cloud + - web-editor + - openmv-ide \ No newline at end of file diff --git a/content/hardware/05.nicla/boards/nicla-vision/datasheet/assets/featured.png b/content/hardware/05.nicla/boards/nicla-vision/datasheet/assets/featured.png new file mode 100644 index 0000000000..d6b383a810 Binary files /dev/null and b/content/hardware/05.nicla/boards/nicla-vision/datasheet/assets/featured.png differ diff --git a/content/hardware/05.nicla/boards/nicla-vision/datasheet/assets/niclaVisionBackTopology.svg b/content/hardware/05.nicla/boards/nicla-vision/datasheet/assets/niclaVisionBackTopology.svg new file mode 100644 index 0000000000..1a1b26ece4 --- /dev/null +++ b/content/hardware/05.nicla/boards/nicla-vision/datasheet/assets/niclaVisionBackTopology.svg @@ -0,0 +1,704 @@ + + + + + +Created by potrace 1.16, written by Peter Selinger 2001-2019 + + + image/svg+xml + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/content/hardware/05.nicla/boards/nicla-vision/datasheet/assets/niclaVisionBlockDiagram.svg b/content/hardware/05.nicla/boards/nicla-vision/datasheet/assets/niclaVisionBlockDiagram.svg new file mode 100644 index 0000000000..3248e9abc6 --- /dev/null +++ b/content/hardware/05.nicla/boards/nicla-vision/datasheet/assets/niclaVisionBlockDiagram.svg @@ -0,0 +1 @@ + \ No newline at end of file diff --git 
a/content/hardware/05.nicla/boards/nicla-vision/datasheet/assets/niclaVisionMech.svg b/content/hardware/05.nicla/boards/nicla-vision/datasheet/assets/niclaVisionMech.svg new file mode 100644 index 0000000000..f658416c68 --- /dev/null +++ b/content/hardware/05.nicla/boards/nicla-vision/datasheet/assets/niclaVisionMech.svg @@ -0,0 +1,392 @@ + + + + + +Created by potrace 1.16, written by Peter Selinger 2001-2019 + + + image/svg+xml + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/content/hardware/05.nicla/boards/nicla-vision/datasheet/assets/niclaVisionPowerTree.svg b/content/hardware/05.nicla/boards/nicla-vision/datasheet/assets/niclaVisionPowerTree.svg new file mode 100644 index 0000000000..0f0355d83c --- /dev/null +++ b/content/hardware/05.nicla/boards/nicla-vision/datasheet/assets/niclaVisionPowerTree.svg @@ -0,0 +1,243 @@ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/content/hardware/05.nicla/boards/nicla-vision/datasheet/assets/niclaVisionSolutionOverview.png b/content/hardware/05.nicla/boards/nicla-vision/datasheet/assets/niclaVisionSolutionOverview.png new file mode 100644 index 0000000000..99181cb898 Binary files /dev/null and b/content/hardware/05.nicla/boards/nicla-vision/datasheet/assets/niclaVisionSolutionOverview.png differ diff --git a/content/hardware/05.nicla/boards/nicla-vision/datasheet/assets/niclaVisionTopTopology.svg 
b/content/hardware/05.nicla/boards/nicla-vision/datasheet/assets/niclaVisionTopTopology.svg new file mode 100644 index 0000000000..d4517a6f52 --- /dev/null +++ b/content/hardware/05.nicla/boards/nicla-vision/datasheet/assets/niclaVisionTopTopology.svg @@ -0,0 +1,704 @@ + + + + + +Created by potrace 1.16, written by Peter Selinger 2001-2019 + + + image/svg+xml + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/content/hardware/05.nicla/boards/nicla-vision/datasheet/datasheet.md b/content/hardware/05.nicla/boards/nicla-vision/datasheet/datasheet.md new file mode 100644 index 0000000000..6cbe63853c --- /dev/null +++ b/content/hardware/05.nicla/boards/nicla-vision/datasheet/datasheet.md @@ -0,0 +1,348 @@ +--- +identifier: ABX00051 +title: ArduinoÂź Nicla Vision +type: pro +author: Ali Jahangiri +--- + +![Nicla Vision](assets/featured.png) + +# Description +The **ArduinoÂź Nicla Vision** packs machine vision capabilities on the edge into a tiny fingerprint. Record, analyse and upload to the cloud all with the help of one **ArduinoÂź Nicla Vision**. Leverage the onboard camera, STM32 microcontroller, Wi-Fi/BluetoothÂź module and 6-axis IMU to create your own wireless sensor network for machine vision applications. 
+ +# Target areas: +wireless sensor networks, data fusion, artificial intelligence, machine vision + +# Features +- **STM32H747AII6** Microcontroller + - Dual core + - 32-bit Arm® Cortex®-M7 core with double-precision FPU and L1 cache up to 480 MHz + - 32-bit Arm® Cortex®-M4 core with FPU up to 240 MHz + - Full set of DSP instructions + - Memory Protection Unit (MPU) +- **Murata® 1DX** Wi-Fi/BT Module + - Wi-Fi 802.11b/g/n 65 Mbps + - Bluetooth 4.2 BR/EDR/LE +- **MAX17262REWL+T** Fuel Gauge + - Implements ModelGauge m5 EZ for battery monitoring + - Low 5.2 ÎŒA operating current + - No calibration required +- **NXP SE050C2** Crypto + - Common Criteria EAL 6+ certified up to OS level + - RSA & ECC functionalities, high key lengths and future-proof curves such as Brainpool, Edwards, and Montgomery + - AES & 3DES encryption and decryption + - HMAC, CMAC, SHA-1, SHA-224/256/384/512 operations + - HKDF, MIFARE® KDF, PRF (TLS-PSK) + - Support of main TPM functionalities + - Secured flash user memory up to 50 kB + - SCP03 (bus encryption and encrypted credential injection on applet and platform level) +- **VL53L1CBV0FY/1** Time-of-Flight Sensor + - Fully integrated miniature module + - 940 nm invisible laser (VCSEL) emitter + - Receiving array with integrated lens + - 400 cm+ detection with full field of view (FoV) +- **MP34DT06JTR** Microphone + - AOP = 122.5 dBSPL + - 64 dB signal-to-noise ratio + - Omnidirectional sensitivity + - -26 dBFS ±1 dB sensitivity +- **GC2145** Camera + - 2-megapixel CMOS camera + - On-chip 10-bit ADC + - 1.75 ÎŒm pixel size +- **LSM6DSOX** 6-axis IMU + - Always-on 3D accelerometer and 3D gyroscope + - Smart FIFO up to 4 KB + - ±2/±4/±8/±16 g full scale + - ±125/±250/±500/±1000/±2000 dps full scale +- **USB3320C-EZK-TR** USB Transceiver + - Integrated ESD protection circuit (up to ±15 kV IEC air discharge) +- **AT25QL128A-UUE-T** 16 MB Flash +- **MC34PF1550A0EP** Power Management IC + +# Contents + +## Introduction + +### 
Application Examples +The **Arduino® Nicla Vision** houses the computational power, camera and IMU you need to quickly develop machine vision solutions at the edge, together with two wireless technologies. The board can act as a field-ready standalone device, or can be augmented with external peripherals through the available I/O. Ultra-low power consumption and integrated battery management allow for deployment in a wide range of settings. WebBLE allows for easy over-the-air (OTA) firmware updates as well as remote monitoring. + +- **Warehouse & Automated Inventory Management**: +The **Arduino Nicla Vision** can detect packages as they enter its vicinity and wake up. This provides the benefits of an always-on camera, but with lower power consumption. It can take pictures, predict volume/weight and also analyze for possible defects. Additionally, QR codes on the package can be tracked for automated pursuit of the package and relay of information to the cloud. + +- **Real-time process management**: +The **Arduino Nicla Vision** is equipped for Automated Optical Inspection (AOI) even in hard-to-reach and hazardous areas, thanks to its small footprint and wireless connectivity options. The fast Time-of-Flight sensor ensures that image acquisition is performed in a repeatable manner, with minimal modifications to the process. Additionally, the IMU can provide vibration analysis for predictive maintenance. + +- **Wireless Sensor Network Reference Design**: +The Nicla form factor has been specifically developed at Arduino® as a standard for wireless sensor networks and can be adapted by partners to develop custom-designed industrial solutions. Researchers and educators can use this platform to work on an industrially recognized standard for wireless sensor research and development that can shorten the time from concept to market.
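The "wake on detection" pattern in the warehouse example above trades latency for battery life, and the saving can be reasoned about with a simple duty-cycle model. The sketch below is purely illustrative — every current figure is an invented placeholder, not a measured value for this board:

```python
# Rough duty-cycle model of an event-driven ("wake on detection") camera node.
# All current figures below are illustrative placeholders, not measurements.

def average_current_ma(active_ma, sleep_ma, duty_cycle):
    """Time-weighted average current for a node that is active a fraction
    `duty_cycle` of the time and asleep the rest of the time."""
    return active_ma * duty_cycle + sleep_ma * (1.0 - duty_cycle)

# Always-on capture vs. waking for only 1% of the time:
always_on = average_current_ma(active_ma=80.0, sleep_ma=80.0, duty_cycle=1.0)
event_driven = average_current_ma(active_ma=80.0, sleep_ma=0.5, duty_cycle=0.01)
savings = always_on / event_driven  # ratio of average currents
```

With these placeholder numbers, the event-driven node averages well under 2 mA against 80 mA for an always-on one — roughly a 60× improvement in battery life, which is the point of the wake-on-detection design.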
+ +### Accessories +- Single-cell Li-ion/Li-Po battery + +### Related Products +- Arduino® Portenta H7 (SKU: ABX00042) + +### Assembly Overview +![Example of a typical solution for remote machine vision including an Arduino® Nicla Vision and LiPo battery.](assets/niclaVisionSolutionOverview.png) + +## Ratings +### Recommended Operating Conditions +| Symbol | Description | Min | Typ | Max | Unit | +| --------- | -------------------------------- | ------------- | --- | ------------- | ---- | +| VIN | Input voltage from VIN pad | 3.5 | 5.0 | 5.5 | V | +| VUSB | Input voltage from USB connector | 4.8 | 5.0 | 5.5 | V | +| VBATT | Input voltage from battery | 3.5 | 3.7 | 4.7 | V | +| VDDIO_EXT | Level Translator Voltage | 1.8 | 3.3 | 3.3 | V | +| VIH | Input high-level voltage | 0.7*VDDIO_EXT | | VDDIO_EXT | V | +| VIL | Input low-level voltage | 0 | | 0.3*VDDIO_EXT | V | +| TOP | Operating Temperature | -40 | 25 | 85 | °C | + +**Note 1:** VDDIO_EXT is software programmable. While the ADC inputs can accept up to 3.3 V, the AREF value is at the STM32 operating voltage. + +**Note 2:** If the internal VDDIO_EXT is disabled, it is possible to supply it externally.
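The VIH/VIL rows in the Recommended Operating Conditions table scale with VDDIO_EXT. A quick sketch of what those limits work out to at the two common logic levels — the values follow directly from the 0.7× and 0.3× factors in the table:

```python
# Logic thresholds implied by the Recommended Operating Conditions table:
# VIH(min) = 0.7 * VDDIO_EXT and VIL(max) = 0.3 * VDDIO_EXT.

def logic_thresholds(vddio_ext):
    """Return (minimum input-high voltage, maximum input-low voltage)
    for a given VDDIO_EXT level, in volts."""
    return 0.7 * vddio_ext, 0.3 * vddio_ext

vih_18, vil_18 = logic_thresholds(1.8)  # VDDIO_EXT configured for 1.8 V
vih_33, vil_33 = logic_thresholds(3.3)  # VDDIO_EXT configured for 3.3 V
```

For example, with VDDIO_EXT at 3.3 V an input must rise above about 2.31 V to be guaranteed to read as logic high, and stay below about 0.99 V to read as logic low; between those levels the logic state is undefined.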
+ +### Power Consumption +| Symbol | Description | Min | Typ | Max | Unit | +| --------- | ----------------------------------------------------------- | --- | --- | --- | ---- | +| PSTDBY | Power consumption in standby | | TBC | | mW | +| PBLINK | Power consumption with blink sketch | | TBC | | mW | +| PSENSE | Power consumption for polling all sensors at 1 Hz | | TBC | | mW | +| PSENSE_LP | Low-power consumption for polling all sensors once per hour | | TBC | | mW | + +## Functional Overview + +### Block Diagram +![Nicla Vision Block Diagram](assets/niclaVisionBlockDiagram.svg) + +### Board Topology +**Top View** + +![Nicla Vision Top View](assets/niclaVisionTopTopology.svg) + +| **Ref.** | **Description** | **Ref.** | **Description** | +| -------- | ------------------------------------------------- | -------- | ----------------------------------------- | +| U1 | STM32H747AII6 Dual Arm® Cortex®-M7/M4 IC | U4 | VL53L1CBV0FY/1 Time-of-Flight sensor IC | +| U5 | USB3320C-EZK-TR USB 2.0 Transceiver | U6 | MP34DT06JTR Omnidirectional Microphone | +| U14 | DSC6151HI2B 25 MHz MEMS Oscillator | U15 | DSC6151HI2B 27 MHz MEMS Oscillator | +| U8 | IS31FL3194-CLS2-TR 3-channel LED IC | U9 | BQ25120AYFPR Battery Charger IC | +| U10 | SN74LVC1T45 1-channel voltage level translator IC | U11 | TXB0108YZPR Bidirectional IC | +| U12 | NTS0304EUKZ 4-bit translating transceiver | J1 | ADC, SPI and GPIO pin headers | +| J2 | I2C, JTAG, Power and GPIO pin headers | J3 | Battery headers | +| DL1 | SMLP34RGB2W3 RGB SMD LED | DL2 | KPHHS-1005SURCK Red LED | +| PB1 | Reset button | J6 | U.FL-R-SMT-1(60) male micro-UFL connector | + +**Back View** +![Nicla Vision Back View](assets/niclaVisionBackTopology.svg) + +| **Ref.** | **Description** | **Ref.** | **Description** | +| -------- | ------------------------------------------- | -------- | -------------------------------------------------- | +| U2,U7 | LM66100DCKR Ideal Diode | U3 | LSM6DSOXTR 6-axis IMU with ML Core | +| U8 | 
SE050C2HQ1/Z01SDZ Crypto IC | U9 | LBEE5KL1DX-883 Wi-Fi/Bluetooth Module | +| U10 | MC34PF1550A0EP PMIC | U11 | TXB0108YZPR Bidirectional Voltage Shifter | +| U12 | NTS0304EUKZ Bidirectional Voltage Shifter | U13 | AT25QL128A-UUE-T 16 MB Flash Memory IC | +| U19 | MAX17262REWL+T Fuel Gauge IC | J4 | BM03B-ACHSS-GAN-TF(LF)(SN) 3-pin battery connector | +| J5 | SM05B-SRSS-TB(LF)(SN) 5-pin ESLOV connector | J7 | Micro USB connector | + + +### Processor +The board's main processor is the dual-core STM32H747 (U1), comprising a Cortex®-M7 running at up to 480 MHz and a Cortex®-M4 running at up to 240 MHz. The two cores communicate via a Remote Procedure Call mechanism that allows calling functions on the other processor seamlessly. + +### 6-Axis IMU +It is possible to obtain 3D gyroscope and 3D accelerometer data from the LSM6DSOX 6-axis IMU (U3). In addition to providing such data, the IMU can also run machine learning for gesture detection, offloading computation load from the main processor. + +### Wi-Fi/Bluetooth Connectivity + The Murata® LBEE5KL1DX-883 wireless module (U9) simultaneously provides Wi-Fi and Bluetooth connectivity in an ultra-small package based on the Cypress CYW4343W. The IEEE 802.11b/g/n Wi-Fi interface can be operated as an access point (AP), a station (STA) or as a dual-mode simultaneous AP/STA, and supports a maximum transfer rate of 65 Mbps. The Bluetooth interface supports Bluetooth Classic and BLE. An integrated antenna circuitry switch allows a single external antenna (J6) to be shared between Wi-Fi and Bluetooth. + +### Crypto Capabilities +The Arduino® Nicla Vision enables IC-level edge-to-cloud security through the NXP SE050C2 crypto chip (U8). This provides Common Criteria EAL 6+ security certification up to OS level, as well as RSA/ECC cryptographic algorithm support and credential storage.
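As a host-side illustration of two of the operations the SE050C2 supports in hardware — SHA-256 digests and HMAC message authentication — the sketch below uses Python's standard library as a stand-in. On the board these computations would run inside the secure element, where key material never leaves the chip; the key here is a throwaway placeholder, not a real credential:

```python
import hashlib
import hmac

# Host-side stand-in for operations the SE050C2 accelerates in hardware:
# hashing a message and authenticating it with a shared secret (HMAC).
message = b"sensor reading: 23.5 C"
digest = hashlib.sha256(message).hexdigest()  # 64 hex chars = 256-bit digest

shared_key = b"demo-key"  # throwaway placeholder, not a real credential
tag = hmac.new(shared_key, message, hashlib.sha256).hexdigest()

# A receiver holding the same key recomputes the tag and compares it in
# constant time to detect tampering without leaking timing information:
expected = hmac.new(shared_key, message, hashlib.sha256).hexdigest()
authentic = hmac.compare_digest(tag, expected)
```

The practical difference on the Nicla Vision is that the secure element performs these steps internally and the private keys are injected and stored under SCP03, so firmware only ever sees the results.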
+ +### Time of Flight Sensor +The VL53L1CBV0FY Time-of-Flight sensor (U4) adds accurate and low-power ranging capabilities to the Arduino® Nicla Vision. The invisible near-infrared VCSEL laser (including the analog driver) is encapsulated together with the receiving optics in an all-in-one small module located below the camera. + +### Digital Microphone +The MP34DT06JTR digital MEMS microphone (U6) is omnidirectional and operates via a capacitive sensing element with a high (64 dB) signal-to-noise ratio. The sensing element, capable of detecting acoustic waves, is manufactured using a specialized silicon micromachining process dedicated to producing audio sensors. + +### Power Tree +![Nicla Vision Power Tree](assets/niclaVisionPowerTree.svg) + +Input voltage can be provided to the Nicla Vision through the USB connector (J7), the ESLOV connector (J5), the battery connector (J4) or, alternatively, the headers. The USB connector is prioritized over the ESLOV connector, both of which are prioritized over the battery connector and header. Reverse polarity protection for the USB connector (J7) and the ESLOV connector (J5) is provided by ideal diodes U2 and U7 respectively. Input voltage from the battery does NOT have reverse polarity protection and the user is responsible for respecting the polarity. + +An NTC (negative temperature coefficient) sensor provides overtemperature shutoff for the battery. The battery fuel gauge provides an indication of the remaining battery capacity. + +There are three main power lines provided: +- **+3V1** provides power to the microprocessor (U1), 25 MHz oscillator (U14), 32.768 kHz oscillator (Y1), USB transceiver (U5) and Wi-Fi/Bluetooth module.
+- **+2V8A** provides power to the camera (M1) and the time-of-flight sensor (U4) +- **+1V8** provides power to the microprocessor (U1), camera (M1), USB transceiver (U5), Wi-Fi/Bluetooth module (U9), accelerometer (U3), microphone (U6), crypto chip (U8), flash memory (U13), 27 MHz oscillator (U15) as well as the two level translators (U11, U12). + +Additionally, a dedicated analog supply rail (VDDA) is provided for the microprocessor (U1). The camera module (M1) also has a dedicated power rail (+1V8CAM). + +## Board Operation +### Getting Started - IDE +If you want to program your Arduino® Nicla Vision while offline, you need to install the Arduino® Desktop IDE **[1]**. To connect the Arduino® Nicla Vision to your computer, you will need a micro USB cable. This also provides power to the board, as indicated by the LED. + +### Getting Started - Arduino Web Editor +All Arduino® boards, including this one, work out-of-the-box on the Arduino® Web Editor **[2]** by just installing a simple plugin. + +The Arduino® Web Editor is hosted online, therefore it will always be up-to-date with the latest features and support for all boards. Follow **[3]** to start coding in the browser and upload your sketches onto your board. + +### Getting Started - Arduino Cloud +All Arduino® IoT-enabled products are supported on Arduino® Cloud, which allows you to log, graph and analyze sensor data, trigger events, and automate your home or business. + +### Getting Started - WebBLE +The Arduino® Nicla Vision provides the capability for OTA updates to the STM32 microcontroller using WebBLE. + +### Getting Started - ESLOV +This board can act as a secondary to an ESLOV controller and have its firmware updated through this method.
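Returning to the Power Tree section above, the input-source priority (USB over ESLOV, both over battery/header) can be modeled as a simple selector. This is a behavioral sketch of the rule as stated in the text, not firmware for the PMIC:

```python
# Behavioral model of the Nicla Vision input-source priority described in
# the Power Tree section: USB (J7) is preferred over ESLOV (J5), which is
# preferred over the battery connector (J4) / header.
PRIORITY = ("USB", "ESLOV", "BATTERY")

def active_source(available):
    """Return the highest-priority power source present in `available`,
    or None if nothing is connected."""
    for source in PRIORITY:
        if source in available:
            return source
    return None

# With both USB and a battery plugged in, USB supplies the board:
selected = active_source({"USB", "BATTERY"})
```

One consequence of this ordering is that plugging in USB while on battery hands the load over to USB (and allows charging), with the battery silently taking over again when USB is removed.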
+### Sample Sketches +Sample sketches for the Arduino® Nicla Vision can be found either in the "Examples" menu in the Arduino® IDE or on the Arduino® documentation website **[4]**. + +### Online Resources +Now that you have gone through the basics of what you can do with the board, you can explore the endless possibilities it provides by checking exciting projects on ProjectHub **[5]**, the Arduino® Library Reference **[6]** and the online store **[7]**, where you will be able to complement your board with sensors, actuators and more. + +### Board Recovery +All Arduino® boards have a built-in bootloader which allows flashing the board via USB. In case a sketch locks up the processor and the board is no longer reachable via USB, it is possible to enter bootloader mode by double-tapping the reset button right after power-up. + +## Connector Pinouts +**Note 1:** All the pins on J1 and J2 (excluding fins) are referenced to the VDDIO_EXT voltage, which can be generated internally or supplied externally. +**Note 2:** I2C1 is connected to the level translator U12, which has internal 10 kΩ pull-ups. The R9 and R10 pull-up resistors are not mounted on the board.
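As a rough sanity check of Note 2 above, the rise time of an I2C line pulled up through a resistance R into a bus capacitance C (measured between 30% and 70% of VDD, per the RC charging curve) is t_r = R·C·ln(0.7/0.3) ≈ 0.8473·R·C. The 50 pF bus capacitance below is an assumed figure for illustration, not a measured value for this board:

```python
import math

# Estimated I2C rise time (30% -> 70% of VDD) for a pull-up resistor R
# charging the bus capacitance C: t_r = R * C * ln(0.7/0.3) ~= 0.8473*R*C.

def i2c_rise_time_ns(r_ohm, c_farad):
    """Rise time in nanoseconds for an RC-limited open-drain bus line."""
    return r_ohm * c_farad * math.log(0.7 / 0.3) * 1e9

# 10 kOhm internal pull-ups (per Note 2) with an assumed 50 pF of bus
# capacitance -- roughly 424 ns:
t_r = i2c_rise_time_ns(10_000, 50e-12)
```

This kind of estimate explains why heavily loaded buses with weak pull-ups may need lower clock speeds or stronger external pull-ups: I2C fast mode (400 kHz) specifies a maximum rise time of 300 ns, which this combination would exceed.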
+ +### J1 Pin Connector + +| Pin | **Function** | **Type** | **Description** | +| --- | ------------ | -------- | ---------------------------------- | +| 1 | GPIO0_EXT | Digital | GPIO Pin 0 | +| 2 | NC | N/A | N/A | +| 3 | CS | Digital | SPI Chip Select | +| 4 | COPI | Digital | SPI Controller Out / Peripheral In | +| 5 | CIPO | Digital | SPI Controller In / Peripheral Out | +| 6 | SCLK | Digital | SPI Clock | +| 7 | ADC2 | Analog | Analog Input 2 | +| 8 | ADC1 | Analog | Analog Input 1 | + +### J2 Pin Header + +| Pin | **Function** | **Type** | **Description** | +| --- | ------------ | -------- | --------------------- | +| 1 | SDA | Digital | I2C Data Line | +| 2 | SCL | Digital | I2C Clock | +| 3 | GPIO1_EXT | Digital | GPIO Pin 1 | +| 4 | GPIO2_EXT | Digital | GPIO Pin 2 | +| 5 | GPIO3_EXT | Digital | GPIO Pin 3 | +| 6 | GND | Power | Ground | +| 7 | VDDIO_EXT | Digital | Logic Level Reference | +| 8 | N/C | N/A | N/A | +| 9 | VIN | Power | Input Voltage | + +### J3 Fins + +| Pin | **Function** | **Type** | **Description** | +| --- | ------------ | -------- | ---------------------------- | +| P1 | SDA_PMIC | Digital | PMIC I2C Data Line | +| P2 | SCL_PMIC | Digital | PMIC I2C Clock Line | +| P3 | SWD | Digital | SWD JTAG Interface Data | +| P4 | SCK | Digital | SWD JTAG Interface Clock | +| P5 | NRST | Digital | Reset Pin | +| P6 | SWO | Digital | SWD JTAG Interface Output | +| P7 | +1V8 | Power | +1.8V Voltage Rail | +| P8 | VOTP_PMIC | Digital | Reserved | + + +## Mechanical Information +![Nicla Vision Mechanical Drawing](assets/niclaVisionMech.svg) + +## Certifications +### Declaration of Conformity CE DoC (EU) +We declare under our sole responsibility that the products above are in conformity with the essential requirements of the following EU Directives and therefore qualify for free movement within markets comprising the European Union (EU) and European Economic Area (EEA).
+ +### Declaration of Conformity to EU RoHS & REACH 211 01/19/2021 +Arduino boards are in compliance with RoHS 2 Directive 2011/65/EU of the European Parliament and RoHS 3 Directive 2015/863/EU of the Council of 4 June 2015 on the restriction of the use of certain hazardous substances in electrical and electronic equipment. + +| **Substance** | **Maximum Limit (ppm)** | +| -------------------------------------- | ----------------------- | +| Lead (Pb) | 1000 | +| Cadmium (Cd) | 100 | +| Mercury (Hg) | 1000 | +| Hexavalent Chromium (Cr6+) | 1000 | +| Poly Brominated Biphenyls (PBB) | 1000 | +| Poly Brominated Diphenyl ethers (PBDE) | 1000 | +| Bis(2-Ethylhexyl) phthalate (DEHP) | 1000 | +| Benzyl butyl phthalate (BBP) | 1000 | +| Dibutyl phthalate (DBP) | 1000 | +| Diisobutyl phthalate (DIBP) | 1000 | + +Exemptions: No exemptions are claimed. + +Arduino boards are fully compliant with the related requirements of European Union Regulation (EC) 1907/2006 concerning the Registration, Evaluation, Authorization and Restriction of Chemicals (REACH). We declare that none of the SVHCs (https://echa.europa.eu/web/guest/candidate-list-table), the Candidate List of Substances of Very High Concern for authorization currently released by ECHA, are present in our products (including packaging) in concentrations equal to or above 0.1%. To the best of our knowledge, we also declare that our products do not contain any of the substances listed on the "Authorization List" (Annex XIV of the REACH regulations) or any Substances of Very High Concern (SVHC) in significant amounts as specified by Annex XVII of the Candidate List published by ECHA (European Chemical Agency) 1907/2006/EC. + +### Conflict Minerals Declaration +As a global supplier of electronic and electrical components, Arduino is aware of our obligations with regard to laws and regulations regarding conflict minerals, specifically the Dodd-Frank Wall Street Reform and Consumer Protection Act, Section 1502.
Arduino does not directly source or process conflict minerals such as tin, tantalum, tungsten, or gold. Conflict minerals are contained in our products in the form of solder or as a component in metal alloys. As part of our reasonable due diligence, Arduino has contacted component suppliers within our supply chain to verify their continued compliance with the regulations. Based on the information received thus far, we declare that our products contain conflict minerals sourced from conflict-free areas. + +## FCC Caution +Any changes or modifications not expressly approved by the party responsible for compliance could void the user's authority to operate the equipment. + +This device complies with part 15 of the FCC Rules. Operation is subject to the following two conditions: + +(1) This device may not cause harmful interference. + +(2) This device must accept any interference received, including interference that may cause undesired operation. + +**FCC RF Radiation Exposure Statement:** + +1. This transmitter must not be co-located or operated in conjunction with any other antenna or transmitter. + +2. This equipment complies with RF radiation exposure limits set forth for an uncontrolled environment. + +3. This equipment should be installed and operated with a minimum distance of 20 cm between the radiator and your body. + +English: +User manuals for license-exempt radio apparatus shall contain the following or equivalent notice in a conspicuous location in the user manual or alternatively on the device or both. This device complies with Industry Canada license-exempt RSS standard(s). Operation is subject to the following two conditions: + +(1) This device may not cause interference. + +(2) This device must accept any interference, including interference that may cause undesired operation of the device. + +French: +Le présent appareil est conforme aux CNR d'Industrie Canada applicables aux appareils radio exempts de licence.
L'exploitation est autorisée aux deux conditions suivantes : + +(1) l'appareil ne doit pas produire de brouillage + +(2) l'utilisateur de l'appareil doit accepter tout brouillage radioélectrique subi, même si le brouillage est susceptible d'en compromettre le fonctionnement. + +**IC SAR Warning:** + +English: +This equipment should be installed and operated with a minimum distance of 20 cm between the radiator and your body. + +French: +Lors de l'installation et de l'exploitation de ce dispositif, la distance entre le radiateur et le corps est d'au moins 20 cm. + +**Important:** The operating temperature of the EUT must not exceed 85 °C and must not be lower than -40 °C. + +Hereby, Arduino S.r.l. declares that this product is in compliance with the essential requirements and other relevant provisions of Directive 2014/53/EU. This product is allowed to be used in all EU member states. + +| Frequency bands | Typical Output Power | +| -------------------- | -------------------- | +| 2.4 GHz, 40 channels | TBC | + + +## Company Information + +| Company name | Arduino S.r.l. | +| --------------- | -------------------------------------------- | +| Company Address | Via Andrea Appiani 25, 20900 Monza MB, Italy | + +## Reference Documentation + +| Ref | Link | +| ---------------------------------- | --------------------------------------------------------------------------------------------------- | +| Arduino® IDE (Desktop) | https://www.arduino.cc/en/Main/Software | +| Arduino® IDE (Cloud) | https://create.arduino.cc/editor | +| Arduino® Cloud IDE Getting Started | https://create.arduino.cc/projecthub/Arduino_Genuino/getting-started-with-arduino-web-editor-4b3e4a | +| Arduino® Pro Website | https://www.arduino.cc/pro | +| Online Store | https://store.arduino.cc/ | + +## Revision History + +| **Date** | **Revision** | **Changes** | +| ---------- | ------------ | --------------- | +| 03-09-2021 | 01 | Initial Version | diff --git 
a/content/hardware/05.nicla/boards/nicla-vision/downloads/ABX00051-full-pinout.pdf b/content/hardware/05.nicla/boards/nicla-vision/downloads/ABX00051-full-pinout.pdf new file mode 100644 index 0000000000..7f23726d21 Binary files /dev/null and b/content/hardware/05.nicla/boards/nicla-vision/downloads/ABX00051-full-pinout.pdf differ diff --git a/content/hardware/05.nicla/boards/nicla-vision/downloads/ABX00051-schematics.pdf b/content/hardware/05.nicla/boards/nicla-vision/downloads/ABX00051-schematics.pdf new file mode 100644 index 0000000000..6ce6421ce9 Binary files /dev/null and b/content/hardware/05.nicla/boards/nicla-vision/downloads/ABX00051-schematics.pdf differ diff --git a/content/hardware/05.nicla/boards/nicla-vision/essentials.md b/content/hardware/05.nicla/boards/nicla-vision/essentials.md new file mode 100644 index 0000000000..5f1c0b88b6 --- /dev/null +++ b/content/hardware/05.nicla/boards/nicla-vision/essentials.md @@ -0,0 +1,34 @@ + + + All you need to know to get started with your new Arduino board. + + + + + + + +The ArduinoBLE library is designed for Arduino boards that have hardware enabled for BLE and Bluetooth 4.0 and above. + + + +The PDM library allows you to use PDM (pulse-density modulation) microphones, like the MP34DT06JTR. + + + + The WiFi library is designed to use the Murata 1DX module, which allows your Arduino to connect to the Internet. + + + + + + + Built-in Examples are sketches included in the Arduino IDE that demonstrate all basic Arduino commands. + + + Discover interesting articles, principles and techniques related to the Arduino ecosystem. + + + The Arduino programming language can be divided into three main parts: functions, values (variables and constants), and structure. 
+ + \ No newline at end of file diff --git a/content/hardware/05.nicla/boards/nicla-vision/features.md b/content/hardware/05.nicla/boards/nicla-vision/features.md new file mode 100644 index 0000000000..6a7df1009a --- /dev/null +++ b/content/hardware/05.nicla/boards/nicla-vision/features.md @@ -0,0 +1,43 @@ + + +The Arduino® Nicla Vision is a powerful microcontroller board equipped with a 2 MP color camera in a tiny form factor. With Wi-Fi and BLE connectivity, the board maximizes compatibility with professional and consumer equipment. The board features an integrated microphone, distance sensor, smart 6-axis motion sensor and MicroPython support. The Nicla Vision can also be battery powered, making it standalone. + + + + + + +The Arduino Nicla Vision is our smallest form factor yet. + + + + + +This 6-axis IMU allows you to obtain 3D gyroscope and 3D accelerometer data. It is also possible to do machine learning on the IMU for gesture detection, offloading computation load from the main processor. + + + + + + + + The Nicla Vision features an STM32H747AII6 dual Arm® Cortex® core: an M7 core at up to 480 MHz plus an M4 core at up to 240 MHz. + + + + + +The board uses the GC2145, a 2 MP color camera. + + + + + + +The MP34DT06JTR digital MEMS microphone is omnidirectional and operates via a capacitive sensing element with a high +signal-to-noise ratio.
+ + + + + diff --git a/content/hardware/05.nicla/boards/nicla-vision/image.svg b/content/hardware/05.nicla/boards/nicla-vision/image.svg new file mode 100644 index 0000000000..2de59929c7 --- /dev/null +++ b/content/hardware/05.nicla/boards/nicla-vision/image.svg @@ -0,0 +1 @@ + \ No newline at end of file diff --git a/content/hardware/05.nicla/boards/nicla-vision/interactive/ABX00051-pinout.png b/content/hardware/05.nicla/boards/nicla-vision/interactive/ABX00051-pinout.png new file mode 100644 index 0000000000..d529a6d1c7 Binary files /dev/null and b/content/hardware/05.nicla/boards/nicla-vision/interactive/ABX00051-pinout.png differ diff --git a/content/hardware/05.nicla/boards/nicla-vision/product.md b/content/hardware/05.nicla/boards/nicla-vision/product.md new file mode 100644 index 0000000000..2a3867ac03 --- /dev/null +++ b/content/hardware/05.nicla/boards/nicla-vision/product.md @@ -0,0 +1,8 @@ +--- +title: Nicla Vision +url_shop: https://store.arduino.cc/products/nicla-vision +url_guide: /software/ide-v1/installing-mbed-os-nicla-boards +core: arduino:mbed_nicla +--- + +The Arduino® Nicla Vision is a ready-to-use, standalone camera for analyzing and processing images on the edge. Thanks to its 2 MP color camera, smart 6-axis motion sensor, integrated microphone and distance sensor, it is suitable for asset tracking, object recognition and predictive maintenance. Quickly implement sensor nodes that send collected data to the Arduino® Cloud (or third-party vendor services) via the integrated Wi-Fi/BLE connectivity. 
diff --git a/content/hardware/05.nicla/boards/nicla-vision/tech-specs.md b/content/hardware/05.nicla/boards/nicla-vision/tech-specs.md new file mode 100644 index 0000000000..e69de29bb2 diff --git a/content/hardware/05.nicla/boards/nicla-vision/tech-specs.yml b/content/hardware/05.nicla/boards/nicla-vision/tech-specs.yml new file mode 100644 index 0000000000..c0f52a64d2 --- /dev/null +++ b/content/hardware/05.nicla/boards/nicla-vision/tech-specs.yml @@ -0,0 +1,35 @@ +Board: + Name: Arduino® Nicla Vision + SKU: ABX00051 +Microcontroller: STM32H747AII6 Dual Arm® Cortex®-M7/M4 +USB connector: Micro USB (micro-B) +Pins: + LED built-in: 1 RGB LED (I2C) + Digital I/O Pins: 10 (of which 2 are shared with I2C and 4 are shared with SPI) + Analog input pins: 2, both shared with PWM + PWM pins: 12 (of which 2 are shared with analog, 2 are shared with I2C and 4 are shared with SPI) + External interrupts: 12 +Connectivity: + Bluetooth: Murata 1DX Bluetooth module + Wi-Fi: Yes + Secure element: NXP SE050C2 crypto chip +Communication: + UART: Yes + I2C: 1 + SPI: 1 +Power: + Microcontroller operating voltage: 1.8V, translated to 3.3V on external pins + Board Power Supply (USB/VIN): 5V + Supported battery: Li-ion/Li-Po single cell, 3.7V + Battery connector: JST 3-pin 1.2 mm pitch + DC Current per I/O pin: 4.7 mA +Clock speed: + Processor (M7): 480 MHz + Processor (M4): 240 MHz +Memory: + STM32H747AII6 microcontroller: 1 MB SRAM, 2 MB flash + QSPI flash: 16 MB +Dimensions: + Weight: 2 g + Width: 22.86 mm (900 mil) + Length: 22.86 mm (900 mil) diff --git a/content/hardware/05.nicla/boards/nicla-vision/tutorials/blob-detection/assets/blob_detection_example.png b/content/hardware/05.nicla/boards/nicla-vision/tutorials/blob-detection/assets/blob_detection_example.png new file mode 100644 index 0000000000..62654e5736 Binary files /dev/null and b/content/hardware/05.nicla/boards/nicla-vision/tutorials/blob-detection/assets/blob_detection_example.png differ diff --git 
a/content/hardware/05.nicla/boards/nicla-vision/tutorials/blob-detection/assets/histogram.png b/content/hardware/05.nicla/boards/nicla-vision/tutorials/blob-detection/assets/histogram.png new file mode 100644 index 0000000000..501dcd1060 Binary files /dev/null and b/content/hardware/05.nicla/boards/nicla-vision/tutorials/blob-detection/assets/histogram.png differ diff --git a/content/hardware/05.nicla/boards/nicla-vision/tutorials/blob-detection/assets/lab_thresholds_apple.png b/content/hardware/05.nicla/boards/nicla-vision/tutorials/blob-detection/assets/lab_thresholds_apple.png new file mode 100644 index 0000000000..24c37b97bc Binary files /dev/null and b/content/hardware/05.nicla/boards/nicla-vision/tutorials/blob-detection/assets/lab_thresholds_apple.png differ diff --git a/content/hardware/05.nicla/boards/nicla-vision/tutorials/blob-detection/assets/lab_thresholds_banana.png b/content/hardware/05.nicla/boards/nicla-vision/tutorials/blob-detection/assets/lab_thresholds_banana.png new file mode 100644 index 0000000000..0adb9ec599 Binary files /dev/null and b/content/hardware/05.nicla/boards/nicla-vision/tutorials/blob-detection/assets/lab_thresholds_banana.png differ diff --git a/content/hardware/05.nicla/boards/nicla-vision/tutorials/blob-detection/assets/por_openmv_board_connected.png b/content/hardware/05.nicla/boards/nicla-vision/tutorials/blob-detection/assets/por_openmv_board_connected.png new file mode 100644 index 0000000000..5b55ac9c76 Binary files /dev/null and b/content/hardware/05.nicla/boards/nicla-vision/tutorials/blob-detection/assets/por_openmv_board_connected.png differ diff --git a/content/hardware/05.nicla/boards/nicla-vision/tutorials/blob-detection/assets/por_openmv_click_connect.png b/content/hardware/05.nicla/boards/nicla-vision/tutorials/blob-detection/assets/por_openmv_click_connect.png new file mode 100644 index 0000000000..3aba99a990 Binary files /dev/null and 
b/content/hardware/05.nicla/boards/nicla-vision/tutorials/blob-detection/assets/por_openmv_click_connect.png differ diff --git a/content/hardware/05.nicla/boards/nicla-vision/tutorials/blob-detection/assets/por_openmv_open_ide.png b/content/hardware/05.nicla/boards/nicla-vision/tutorials/blob-detection/assets/por_openmv_open_ide.png new file mode 100644 index 0000000000..8bbb9a3496 Binary files /dev/null and b/content/hardware/05.nicla/boards/nicla-vision/tutorials/blob-detection/assets/por_openmv_open_ide.png differ diff --git a/content/hardware/05.nicla/boards/nicla-vision/tutorials/blob-detection/assets/por_openmv_reset_firmware.png b/content/hardware/05.nicla/boards/nicla-vision/tutorials/blob-detection/assets/por_openmv_reset_firmware.png new file mode 100644 index 0000000000..8038d0676f Binary files /dev/null and b/content/hardware/05.nicla/boards/nicla-vision/tutorials/blob-detection/assets/por_openmv_reset_firmware.png differ diff --git a/content/hardware/05.nicla/boards/nicla-vision/tutorials/blob-detection/content.md b/content/hardware/05.nicla/boards/nicla-vision/tutorials/blob-detection/content.md new file mode 100644 index 0000000000..e68e2fa1cb --- /dev/null +++ b/content/hardware/05.nicla/boards/nicla-vision/tutorials/blob-detection/content.md @@ -0,0 +1,238 @@ +--- +title: Blob Detection with OpenMV +difficulty: intermediate +tags: [OpenMV, Blob Detection, Machine Vision] +description: This tutorial will show you how to use the Nicla Vision to detect the presence and the position of objects in a camera image. +author: Sebastian Romero +--- + +## Overview +In this tutorial you will use the Arduino® Nicla Vision to detect the presence and the position of objects in a camera image. For that you will use a technique referred to as blob detection. For this task you will write a MicroPython script and run it on the Nicla Vision with the help of the OpenMV IDE.
+ +## Goals +- Learn how to use the OpenMV IDE to run MicroPython on Nicla Vision +- Learn how to use the built-in blob detection algorithm of OpenMV +- Learn how to use MicroPython to toggle the built-in LEDs + +### Required Hardware and Software +- [Nicla Vision board](https://store.arduino.cc/products/nicla-vision) +- Micro USB cable (either USB A to Micro USB or USB C to Micro USB) +- OpenMV IDE 2.6.4+ + +## Nicla Vision and the OpenMV IDE + +The OpenMV IDE was built for machine vision applications. It is meant to provide an Arduino-like experience for simple computer vision tasks using a camera sensor. OpenMV comes with its own firmware that is built on MicroPython. Among other hardware, it supports the Nicla Vision board. OpenMV allows you to easily preview the camera stream and visually inspect color ranges to define thresholds for your machine vision scripts. [Here](https://openmv.io/) you can read more about the OpenMV IDE. + +## Instructions + +### Configuring the Development Environment +Before you can start programming OpenMV scripts for the Nicla Vision you need to download and install the OpenMV IDE. Open the [OpenMV download](https://openmv.io/pages/download) page in your browser, download the version that you need for your operating system and follow the instructions of the installer. + +### Flashing the OpenMV Firmware + +Connect the Nicla Vision to your computer via the USB cable if you haven't done so yet. Put the Nicla Vision in bootloader mode by double-pressing the reset button on the board. The built-in LED will start fading in and out. Now open the OpenMV IDE. + +![The OpenMV IDE after starting it](assets/por_openmv_open_ide.png) + +Click on the "connect" symbol at the bottom of the left toolbar. + +![Click the connect button to attach the Nicla Vision to the OpenMV IDE](assets/por_openmv_click_connect.png) + +A pop-up saying "DFU bootloader(s) found. What would you like to do?" will ask you how you would like to proceed.
Select "Reset Firmware to Release Version". This will install the latest OpenMV firmware on the board. If it asks you whether it should erase the internal file system, you can click "No". + +![Install the latest version of the OpenMV firmware](assets/por_openmv_reset_firmware.png) + +The board's LED will start flashing while the OpenMV firmware is being uploaded. A pop-up window will open and show you the upload progress. Wait until the LED stops flashing and fading. You will see a message saying "DFU firmware update complete!" when the process is done. + +***Installing the OpenMV firmware will overwrite any existing sketches in the internal Flash. As a result, the board's port won't be exposed in the Arduino IDE anymore. To re-flash an Arduino firmware you need to put the board into bootloader mode. To do so, double-press the reset button on the board. The built-in LED will start fading in and out. In bootloader mode you will see the board's port again in the Arduino IDE.*** + +The Nicla Vision will start flashing its blue LED when it's ready to be connected. After confirming the completion dialog the board should already be connected to the OpenMV IDE; otherwise, click the "connect" button (plug icon) once again. + +![When the Nicla Vision is successfully connected to the OpenMV IDE a green play button appears in the lower left](assets/por_openmv_board_connected.png) + + +## Blob Detection + +In this section you will learn how to use the built-in blob detection algorithm to detect the location of objects in an image. The algorithm detects areas in a digital image that differ in properties, such as brightness or color, from the surrounding areas. These areas are called blobs. Think of a blob as a lump of similar pixels.
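To make the idea concrete, here is a toy sketch of the principle in plain Python (a hypothetical `find_blob` helper, not the OpenMV implementation): it flood-fills the first lump of matching pixels in a tiny binary grid and reports its bounding box, which is essentially what OpenMV's `find_blobs` does per blob on a real image.

```python
def find_blob(grid, match=1):
    """Return the bounding box (x, y, w, h) of the first blob of
    `match` pixels in a 2D grid, found with a simple flood fill."""
    rows, cols = len(grid), len(grid[0])
    seen = set()
    for y in range(rows):
        for x in range(cols):
            if grid[y][x] != match:
                continue
            # Flood-fill outward from the first matching pixel
            stack, blob = [(x, y)], []
            while stack:
                cx, cy = stack.pop()
                if (cx, cy) in seen:
                    continue
                seen.add((cx, cy))
                if 0 <= cx < cols and 0 <= cy < rows and grid[cy][cx] == match:
                    blob.append((cx, cy))
                    stack += [(cx + 1, cy), (cx - 1, cy), (cx, cy + 1), (cx, cy - 1)]
            xs = [px for px, _ in blob]
            ys = [py for _, py in blob]
            return (min(xs), min(ys), max(xs) - min(xs) + 1, max(ys) - min(ys) + 1)
    return None  # No matching pixels at all

grid = [
    [0, 0, 0, 0],
    [0, 1, 1, 0],
    [0, 1, 1, 0],
    [0, 0, 0, 0],
]
print(find_blob(grid))  # The 2x2 lump of 1s -> (1, 1, 2, 2)
```

The real algorithm additionally matches pixels against color ranges rather than exact values and can merge overlapping blobs, but the core idea of grouping adjacent similar pixels is the same.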
+ +Application Examples: + +- Detect specific vehicles passing in front of the camera +- Detect missing pieces in an assembly line +- Detect insect infestation on vegetables + +To find blobs you need to feed an image from the camera to the algorithm. It will then analyze it and output the coordinates of the found blobs. You will visualize these coordinates directly on the image and indicate whether a blob was found by using the red and green LED. + +### 1. Prepare the Script + +Create a new script by clicking the "New File" button in the toolbar on the left side. Import the required modules: + +```python +import pyb # Import module for board related functions +import sensor # Import the module for sensor related functions +import image # Import module containing machine vision algorithms +import time # Import module for tracking elapsed time +``` + +A module in Python is a confined bundle of functionality. Importing it into the script makes it available. + +### 2. Preparing the Sensor + +In order to take a snapshot, the camera has to be configured in the script. + +```python +sensor.reset() # Resets the sensor +sensor.set_pixformat(sensor.RGB565) # Sets the sensor to RGB +sensor.set_framesize(sensor.QVGA) # Sets the resolution to 320x240 px +sensor.set_vflip(True) # Flips the image vertically +sensor.set_hmirror(True) # Mirrors the image horizontally +sensor.skip_frames(time = 2000) # Skip some frames to let the image stabilize +``` + +The most relevant functions in this snippet are `set_pixformat` and `set_framesize`. The camera that comes with the Nicla Vision supports RGB565 images. Therefore, you need to set it via the `sensor.RGB565` parameter. + +The resolution of the camera needs to be set to a format supported by both the sensor and the algorithm. `QVGA` is a good trade-off between performance and resolution, so you will use that in this tutorial.
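A side note on the `RGB565` format mentioned above: each pixel packs red, green and blue into a single 16-bit value (5 + 6 + 5 bits). The packing can be sketched in plain Python as follows (`rgb888_to_rgb565` is a hypothetical helper for illustration, not part of the OpenMV API):

```python
def rgb888_to_rgb565(r, g, b):
    # Keep the top 5 bits of red, 6 of green and 5 of blue,
    # then pack them into one 16-bit value (RRRRRGGGGGGBBBBB).
    return ((r & 0xF8) << 8) | ((g & 0xFC) << 3) | ((b & 0xF8) >> 3)

print(hex(rgb888_to_rgb565(255, 0, 0)))  # pure red -> 0xf800
```

Green gets the extra bit because the eye is most sensitive to it. Since the low bits of each channel are discarded, slightly different 24-bit colors map to the same RGB565 value, which is one reason the color thresholds in the next step are defined as ranges rather than exact colors.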
+ +Depending on how you hold the camera you may want to play with the `set_vflip` and `set_hmirror` functions. To hold the board with the USB cable facing down you will need to call `set_vflip(True)`. If you want the image to be displayed the same way as you see it with your eyes, you need to call `sensor.set_hmirror(True)`. Otherwise elements in the image, such as text, would be mirrored. + +### 3. Defining the Color Thresholds + +In order to feed the blob detection algorithm with an image you have to take a snapshot from the camera or load the image from memory (e.g. SD card or internal Flash). In this case you will take a snapshot using the `snapshot()` function. The resulting image then needs to be fed to the algorithm using the `find_blobs` function. You will notice that a list of tuples gets passed to the algorithm. In this list you can specify the LAB color values that are mostly contained in the object that you would like to track. If, for example, you were to detect purely red objects on a black background, the resulting range of colors would be very narrow. The corresponding LAB value for pure red is roughly (53,80,67). A slightly brighter red could be (55,73,50). Therefore the LAB range would be L: 53-55, A: 73-80, B: 50-67. + +OpenMV provides a convenient tool to figure out the desired color ranges: the Threshold Editor. You can find it in the OpenMV IDE in the menu under **Tools > Machine Vision > Threshold Editor**. Place the desired object in front of the camera and open the tool. When it asks you about the "Source image location?", select "Frame Buffer". In the window that opens you will see a snapshot from the camera and a few sliders to adjust the LAB color ranges. As you move the sliders you will see in the black-and-white image on the right-hand side which of the pixels would match the set color range. White pixels denote the matching pixels. As you can see in the following example, the pixels of a nice red apple on a brown background are very nicely clustered.
It results in mostly one big blob. + +![LAB thresholds for an apple in the Threshold Editor](assets/lab_thresholds_apple.png) + +To get a rough idea of the LAB color range of the target object you can use the histogram view in OpenMV. Make sure you have set the histogram to LAB color mode. Draw a rectangle with the mouse pointer over the target object in the frame buffer view. In the histogram you can see which color values appear most often. You can set the target color ranges to the min and max values of the corresponding color component. + +![LAB color histogram of frame buffer](assets/histogram.png) + +As opposed to the example above with the apple, the clustering of the banana's pixels is slightly less coherent. This is because the banana lies on a background of a similar color, which means that the algorithm is sensitive to the background pixels. In order to exclude blobs that don't belong to the target object, additional filtering is necessary. You can, for example, set a minimum bounding box size or blob pixel density, constrain the elongation or roundness of the object, or just look for objects in a specific part of the image. + +![LAB thresholds for a banana in the Threshold Editor](assets/lab_thresholds_banana.png) + +### 4. Detecting Blobs + +Now that you know the range of color values to be used to find the blobs, you can pass these two 6-value tuples as a list to the `find_blobs` function: + +```python +# Define the min/max LAB values we're looking for +thresholdsApple = (24, 60, 32, 54, 0, 42) +thresholdsBanana = (45, 75, 5, -10, 40, 12) +img = sensor.snapshot() # Takes a snapshot and saves it in memory + +# Find blobs with a minimal area of 50x50 = 2500 px +# Overlapping blobs will be merged +blobs = img.find_blobs([thresholdsApple, thresholdsBanana], area_threshold=2500, merge=True) +``` + +Once the blobs are detected you may be interested in seeing where in the image they were found. This can be done by drawing directly on the camera image.
+ +```python +# Draw blobs +for blob in blobs: + # Draw a rectangle where the blob was found + img.draw_rectangle(blob.rect(), color=(0,255,0)) + # Draw a cross in the middle of the blob + img.draw_cross(blob.cx(), blob.cy(), color=(0,255,0)) +``` + +If you need to know which blob matched which color threshold you can use the `blob.code()` function (see [here](https://docs.openmv.io/library/omv.image.html#image.image.blob.blob.code) for more information). + +The result of that will be visible in the Frame Buffer preview panel on the right side of the OpenMV IDE. + +![Visualization of the blobs in the frame buffer preview](assets/blob_detection_example.png) + +### 5. Toggling LEDs + +What if you want some visual feedback from the blob detection without any computer connected to your board? You could, for example, use the built-in LEDs to indicate whether or not a blob was found in the camera image. Let's initialize the red and the green LEDs with the following code: + +```python +ledRed = pyb.LED(1) # Initializes the red LED +ledGreen = pyb.LED(2) # Initializes the green LED +``` + +Then add the logic that will turn on the appropriate LED if a blob is present. This part of the code will be added after the "Draw Blobs" logic. + +```python +# Turn on the green LED if a blob was found +if len(blobs) > 0: + ledGreen.on() + ledRed.off() +else: + # Turn the red LED on if no blob was found + ledGreen.off() + ledRed.on() +``` + +In this example the green LED will light up when there is at least one blob found in the image. The red LED will light up if no blob could be found. + +### 6. Uploading the Script +Let's program the board with the complete script and test if the algorithm works. Copy the following script and paste it into the new script file that you created.
+ +```python +import pyb # Import module for board related functions +import sensor # Import the module for sensor related functions +import image # Import module containing machine vision algorithms +import time # Import module for tracking elapsed time + +sensor.reset() # Resets the sensor +sensor.set_pixformat(sensor.RGB565) # Sets the sensor to RGB +sensor.set_framesize(sensor.QVGA) # Sets the resolution to 320x240 px +sensor.set_vflip(True) # Flips the image vertically +sensor.set_hmirror(True) # Mirrors the image horizontally +sensor.skip_frames(time = 2000) # Skip some frames to let the image stabilize + +# Define the min/max LAB values we're looking for +thresholdsApple = (24, 60, 32, 54, 0, 42) +thresholdsBanana = (45, 75, 5, -10, 40, 12) + +ledRed = pyb.LED(1) # Initializes the red LED +ledGreen = pyb.LED(2) # Initializes the green LED + +clock = time.clock() # Instantiates a clock object + +while(True): + clock.tick() # Advances the clock + img = sensor.snapshot() # Takes a snapshot and saves it in memory + + # Find blobs with a minimal area of 50x50 = 2500 px + # Overlapping blobs will be merged + blobs = img.find_blobs([thresholdsApple, thresholdsBanana], area_threshold=2500, merge=True) + + # Draw blobs + for blob in blobs: + # Draw a rectangle where the blob was found + img.draw_rectangle(blob.rect(), color=(0,255,0)) + # Draw a cross in the middle of the blob + img.draw_cross(blob.cx(), blob.cy(), color=(0,255,0)) + + # Turn on green LED if a blob was found + if len(blobs) > 0: + ledGreen.on() + ledRed.off() + else: + # Turn the red LED on if no blob was found + ledGreen.off() + ledRed.on() + + pyb.delay(50) # Pauses the execution for 50ms + print(clock.fps()) # Prints the framerate to the serial console +``` + +Click on the "Play" button at the bottom of the left toolbar. Place some objects on your desk and check if the Nicla Vision can detect them. + +***The MicroPython script doesn't get compiled and linked into an actual firmware.
Instead, it gets copied to the internal Flash of the board, where it is interpreted and executed on the fly.*** + +## Conclusion + +In this tutorial you learned how to use the OpenMV IDE to develop MicroPython scripts that run on the Nicla Vision. You also learned how to configure the camera of the Nicla Vision to be used for machine vision applications in OpenMV. Last but not least, you learned how to interact with the built-in LEDs in MicroPython on the OpenMV firmware. + +### Next Steps +- Familiarize yourself with the OpenMV IDE. There are many other features that didn't get mentioned in this tutorial (e.g. the Serial Terminal). +- Try out other machine vision examples that come with the OpenMV IDE. You can find them in the "Examples" menu. + +## Troubleshooting +### OpenMV Firmware Flashing Issues +- If the upload of the OpenMV firmware fails during the download, put the board back in bootloader mode and try again. Give it a few tries until the firmware gets successfully uploaded. +- If the upload of the OpenMV firmware fails without even starting, try uploading the latest firmware using the "Load Specific Firmware File" option. You can find the latest firmware on the [OpenMV GitHub repository](https://github.com/openmv/openmv/releases). Look for a file called **firmware.bin** in the NVISION folder. +- If the camera cannot be recognized by the OpenMV IDE or if you see a "No OpenMV Cams found!" message, press the reset button of the board once and wait until you see the blue LED flashing. Then try connecting to the board again. +- If you see an "OSError: Reset Failed" message, reset the board by pressing the reset button. Wait until you see the blue LED flashing, connect the board to the OpenMV IDE and try running the script again.
diff --git a/content/hardware/05.nicla/boards/nicla-vision/tutorials/getting-started/assets/openmv-nicla-vision-camera.png b/content/hardware/05.nicla/boards/nicla-vision/tutorials/getting-started/assets/openmv-nicla-vision-camera.png new file mode 100644 index 0000000000..cfc89df2e1 Binary files /dev/null and b/content/hardware/05.nicla/boards/nicla-vision/tutorials/getting-started/assets/openmv-nicla-vision-camera.png differ diff --git a/content/hardware/05.nicla/boards/nicla-vision/tutorials/getting-started/assets/openmv_board_connected.png b/content/hardware/05.nicla/boards/nicla-vision/tutorials/getting-started/assets/openmv_board_connected.png new file mode 100644 index 0000000000..5b55ac9c76 Binary files /dev/null and b/content/hardware/05.nicla/boards/nicla-vision/tutorials/getting-started/assets/openmv_board_connected.png differ diff --git a/content/hardware/05.nicla/boards/nicla-vision/tutorials/getting-started/assets/openmv_click_connect.png b/content/hardware/05.nicla/boards/nicla-vision/tutorials/getting-started/assets/openmv_click_connect.png new file mode 100644 index 0000000000..3aba99a990 Binary files /dev/null and b/content/hardware/05.nicla/boards/nicla-vision/tutorials/getting-started/assets/openmv_click_connect.png differ diff --git a/content/hardware/05.nicla/boards/nicla-vision/tutorials/getting-started/assets/openmv_firmware_updater.png b/content/hardware/05.nicla/boards/nicla-vision/tutorials/getting-started/assets/openmv_firmware_updater.png new file mode 100644 index 0000000000..0c388077d0 Binary files /dev/null and b/content/hardware/05.nicla/boards/nicla-vision/tutorials/getting-started/assets/openmv_firmware_updater.png differ diff --git a/content/hardware/05.nicla/boards/nicla-vision/tutorials/getting-started/assets/openmv_open_ide.png b/content/hardware/05.nicla/boards/nicla-vision/tutorials/getting-started/assets/openmv_open_ide.png new file mode 100644 index 0000000000..8bbb9a3496 Binary files /dev/null and 
b/content/hardware/05.nicla/boards/nicla-vision/tutorials/getting-started/assets/openmv_open_ide.png differ diff --git a/content/hardware/05.nicla/boards/nicla-vision/tutorials/getting-started/assets/openmv_reset_firmware.png b/content/hardware/05.nicla/boards/nicla-vision/tutorials/getting-started/assets/openmv_reset_firmware.png new file mode 100644 index 0000000000..8038d0676f Binary files /dev/null and b/content/hardware/05.nicla/boards/nicla-vision/tutorials/getting-started/assets/openmv_reset_firmware.png differ diff --git a/content/hardware/05.nicla/boards/nicla-vision/tutorials/getting-started/content.md b/content/hardware/05.nicla/boards/nicla-vision/tutorials/getting-started/content.md new file mode 100644 index 0000000000..132189d045 --- /dev/null +++ b/content/hardware/05.nicla/boards/nicla-vision/tutorials/getting-started/content.md @@ -0,0 +1,203 @@ +--- +title: 'Getting Started with Nicla Vision' +description: 'This tutorial teaches you how to set up the board, how to use the OpenMV IDE and how to run a MicroPython sketch.' +difficulty: easy +tags: + - Getting Started + - OpenMV + - Setup + - MicroPython +author: 'Benjamin Dannegård' +libraries: + - name: MicroPython + url: http://docs.MicroPython.org/en/latest/ +software: + - openMV +--- + +## Overview +The OpenMV IDE is meant to provide an Arduino-like experience for simple machine vision tasks using a camera sensor. In this tutorial, you will learn about some of the basic features of the OpenMV IDE and how to create a simple MicroPython script. The Nicla Vision board comes with the OpenMV firmware installed by default, making it easy to connect the board to the OpenMV IDE.
+ +## Goals +- The basic features of the OpenMV IDE +- How to create a simple MicroPython script +- How to use the OpenMV IDE to run MicroPython on Nicla Vision + + +### Required Hardware and Software +- [Nicla Vision board](https://store.arduino.cc/products/nicla-vision) +- Micro USB cable (either USB A to Micro USB or USB C to Micro USB) +- OpenMV IDE 2.6.4+ + +## Instructions + +Using the OpenMV IDE you can run [MicroPython](http://docs.MicroPython.org/en/latest/) scripts on the Nicla Vision board. MicroPython provides a lot of classes and modules that make it easy to quickly explore the features of the Nicla Vision. In this tutorial you will first download the OpenMV IDE and set up the development environment. [Here](https://openmv.io/) you can read more about the OpenMV IDE. OpenMV comes with its own firmware that is built on MicroPython. You will then learn to write a simple script that will blink the on-board RGB LED using some basic MicroPython commands. + +### 1. Downloading the OpenMV IDE + +Before you can start programming OpenMV scripts for the Nicla Vision you need to download and install the OpenMV IDE. + +Open the [OpenMV download](https://openmv.io/pages/download) page in your browser, download the version that you need for your operating system and follow the instructions of the installer. + +### 2. Connecting to the OpenMV IDE + +Connect the Nicla Vision to your computer via the USB cable if you haven't done so yet. + +![The OpenMV IDE after starting it](assets/openmv_open_ide.png) + +Click on the "connect" symbol at the bottom of the left toolbar. + +![Click the connect button to attach the Nicla Vision to the OpenMV IDE](assets/openmv_click_connect.png) + +A pop-up will ask you how you would like to proceed. Select "Reset Firmware to Release Version". This will install the latest OpenMV firmware on the Nicla Vision. You can leave the option of erasing the internal file system unselected and click "OK". 
+ +![Install the latest version of the OpenMV firmware](assets/openmv_reset_firmware.png) + +Nicla Vision's green LED will start flashing while the OpenMV firmware is being uploaded to the board. A terminal window will open and show you the flashing progress. Wait until the green LED stops flashing and fading. You will see a message saying "DFU firmware update complete!" when the process is done. + +![Installing firmware on Nicla Vision board in OpenMV](assets/openmv_firmware_updater.png) + +The board will start flashing its blue LED when it's ready to be connected. After confirming the completion dialog the Nicla Vision should already be connected to the OpenMV IDE; otherwise, click the "connect" button (plug symbol) once again. + +![When the Nicla Vision is successfully connected a green play button appears](assets/openmv_board_connected.png) + +### 3. Preparing the Script + +Create a new script by clicking the "New File" button in the toolbar on the left side. Import the required module `pyb`: + +```python +import pyb # Import module for board related functions +``` + +A module in Python is a confined bundle of functionality. Importing it into the script makes it available. For this example we only need `pyb`, which is a module that contains board-related functionality such as pin handling. You can read more about its functions [here](https://docs.micropython.org/en/latest/library/pyb.html). + +Now we can create the variables that will control our built-in RGB LED. With `pyb` we can easily control each color. + +```python +redLED = pyb.LED(1) # built-in red LED +greenLED = pyb.LED(2) # built-in green LED +blueLED = pyb.LED(3) # built-in blue LED +``` + +Now we can easily distinguish which color we control in the script. + +### 4. Creating the Main Loop in the Script + +Putting our code inside a while loop will make the code run continuously. In the loop we turn on an LED with `on`, then we use the `delay` function to create a delay.
This function pauses execution of the script for the given time before the next instruction runs. The duration of the delay is controlled by the value inside the parentheses: the number defines how many milliseconds the board will wait. After the specified time has passed, we turn off the LED with the `off` function. We repeat that for each color. + +```python +while True: + # Turns on the red LED + redLED.on() + # Makes the script wait for 1 second (1000 milliseconds) + pyb.delay(1000) + # Turns off the red LED + redLED.off() + pyb.delay(1000) + greenLED.on() + pyb.delay(1000) + greenLED.off() + pyb.delay(1000) + blueLED.on() + pyb.delay(1000) + blueLED.off() + pyb.delay(1000) +``` + +### 5. Uploading the Script + +Here you can see the complete blink script: + +```python +import pyb # Import module for board related functions + +redLED = pyb.LED(1) # built-in red LED +greenLED = pyb.LED(2) # built-in green LED +blueLED = pyb.LED(3) # built-in blue LED + +while True: + + # Turns on the red LED + redLED.on() + # Makes the script wait for 1 second (1000 milliseconds) + pyb.delay(1000) + # Turns off the red LED + redLED.off() + pyb.delay(1000) + greenLED.on() + pyb.delay(1000) + greenLED.off() + pyb.delay(1000) + blueLED.on() + pyb.delay(1000) + blueLED.off() + pyb.delay(1000) +``` + +Connect your board to the OpenMV IDE and upload the above script by pressing the play button in the lower left corner. + +![Press the green play button to upload the script](assets/openmv_board_connected.png) + +Now the built-in LED on your Nicla Vision board should be blinking red, green and then blue repeatedly. + +## Using the Nicla Vision Camera + +You can easily access the camera on the Nicla Vision through the OpenMV IDE. Below is a short script that will set up the camera and take an image. The board will blink its LED to indicate when it will take the picture. The image can be seen in the frame buffer while the script is running.
+ +```python +import pyb # Import module for board related functions +import sensor # Import the module for sensor related functions +import image # Import module containing machine vision algorithms + +redLED = pyb.LED(1) # built-in red LED +blueLED = pyb.LED(3) # built-in blue LED + +sensor.reset() # Initialize the camera sensor. +sensor.set_pixformat(sensor.RGB565) # Sets the sensor to RGB +sensor.set_framesize(sensor.QVGA) # Sets the resolution to 320x240 px +sensor.set_vflip(True) # Flips the image vertically +sensor.set_hmirror(True) # Mirrors the image horizontally + +redLED.on() +sensor.skip_frames(time = 2000) # Skip some frames to let the image stabilize + +redLED.off() +blueLED.on() + +print("You're on camera!") +sensor.snapshot().save("example.jpg") + +blueLED.off() +print("Done! Reset the camera to see the saved image.") +``` + +The camera that comes with the Nicla Vision supports RGB565 images. That's why we use `sensor.set_pixformat(sensor.RGB565)`, enabling the camera to take a color image. Then we need to set the resolution of the camera. Here we will use `sensor.set_framesize(sensor.QVGA)`. + +Using `sensor.set_vflip` and `sensor.set_hmirror` will help us set the correct orientation of the image. If you hold the board with the USB cable facing down you want to call `sensor.set_vflip(True)`. The image will also be mirrored; if you want the image to be displayed as you see it from your perspective, call `sensor.set_hmirror(True)`. + +Running this script in OpenMV will show the image that the camera is currently capturing in the top right corner, inside the frame buffer. The onboard red LED will be on for a couple of seconds, then the blue LED will turn on; this indicates that the picture is about to be taken. A message will be printed in the serial terminal when the image is taken. + +![Where to see the captured image in OpenMV](assets/openmv-nicla-vision-camera.png) + +The image will be saved as "example.jpg" in the board's directory.
It is also possible to save the image in ".bmp" format. If you reset the camera by pressing the reset button, the image file will appear in the board's directory. + +## Using the Nicla Vision with Arduino IDE + +As mentioned before, the Nicla Vision comes with the OpenMV firmware pre-installed. This makes it easier to use the board with OpenMV out of the box. However, it is also possible to use the Nicla Vision with the Arduino IDE. First make sure that you have the latest core installed. To install the core, navigate to **Tools > Board > Boards Manager...**, search for **Nicla Vision MBED** in the Boards Manager window and install it. Once the core is installed and your board is connected to your computer, select the port that the board is connected to and the board's core. You should now be able to upload an Arduino sketch to the board. + +If you wish to use the board with OpenMV after it has been used with the Arduino IDE, you have to put the board into bootloader mode and install the OpenMV firmware. You do this by double-pressing the reset button, located next to the LED. When the board is in bootloader mode and connected to your computer, follow the steps above in the **2. Connecting to the OpenMV IDE** section to connect the board to the OpenMV IDE again. + +## Conclusion +In this tutorial you learned how to use the OpenMV IDE with your Nicla Vision board. You also learned how to control the Nicla Vision's RGB LED with MicroPython functions and how to upload scripts to your board using the OpenMV IDE. + +### Next Steps +- Experiment with MicroPython's capabilities. If you want some examples of what to do, take a look at the examples included in the OpenMV IDE. Go to **File > Examples > Arduino** in the OpenMV IDE. +- It is possible to use the board for more advanced image processing tasks. Be sure to take a look at our other tutorials if you want to learn more. +- Take a look at our other Nicla Vision tutorials, which showcase its many uses.
You can find them [here](https://docs.arduino.cc/hardware/nicla-vision#tutorials).
+
+## Troubleshooting
+
+### OpenMV Firmware Flashing Issues
+- If the upload of the OpenMV firmware fails during the download, put the board back in bootloader mode and try again. Repeat until the firmware is successfully uploaded.
+- If the OpenMV IDE still can't connect after flashing the firmware, try uploading the latest firmware using the "Load Specific Firmware File" option. You can find the latest firmware in the [OpenMV GitHub repository](https://github.com/openmv/openmv/releases). Look for a file named **firmware.bin**.
+- If you see an "OSError: Reset Failed" message, reset the board by pressing the reset button. Wait until you see the blue LED flashing, connect the board to the OpenMV IDE and try running the script again. diff --git a/content/hardware/05.nicla/boards/nicla-vision/tutorials/image-classification/assets/data_split_ratio.png b/content/hardware/05.nicla/boards/nicla-vision/tutorials/image-classification/assets/data_split_ratio.png new file mode 100644 index 0000000000..51aca580eb Binary files /dev/null and b/content/hardware/05.nicla/boards/nicla-vision/tutorials/image-classification/assets/data_split_ratio.png differ diff --git a/content/hardware/05.nicla/boards/nicla-vision/tutorials/image-classification/assets/deployment.png b/content/hardware/05.nicla/boards/nicla-vision/tutorials/image-classification/assets/deployment.png new file mode 100644 index 0000000000..31124114e4 Binary files /dev/null and b/content/hardware/05.nicla/boards/nicla-vision/tutorials/image-classification/assets/deployment.png differ diff --git a/content/hardware/05.nicla/boards/nicla-vision/tutorials/image-classification/assets/dsp_parameters.png b/content/hardware/05.nicla/boards/nicla-vision/tutorials/image-classification/assets/dsp_parameters.png new file mode 100644 index 0000000000..8aad10eddc Binary files /dev/null and
b/content/hardware/05.nicla/boards/nicla-vision/tutorials/image-classification/assets/dsp_parameters.png differ diff --git a/content/hardware/05.nicla/boards/nicla-vision/tutorials/image-classification/assets/edge_impulse_classification.png b/content/hardware/05.nicla/boards/nicla-vision/tutorials/image-classification/assets/edge_impulse_classification.png new file mode 100644 index 0000000000..351fe0102a Binary files /dev/null and b/content/hardware/05.nicla/boards/nicla-vision/tutorials/image-classification/assets/edge_impulse_classification.png differ diff --git a/content/hardware/05.nicla/boards/nicla-vision/tutorials/image-classification/assets/edge_impulse_login.png b/content/hardware/05.nicla/boards/nicla-vision/tutorials/image-classification/assets/edge_impulse_login.png new file mode 100644 index 0000000000..e5030853b0 Binary files /dev/null and b/content/hardware/05.nicla/boards/nicla-vision/tutorials/image-classification/assets/edge_impulse_login.png differ diff --git a/content/hardware/05.nicla/boards/nicla-vision/tutorials/image-classification/assets/edge_impulse_training.png b/content/hardware/05.nicla/boards/nicla-vision/tutorials/image-classification/assets/edge_impulse_training.png new file mode 100644 index 0000000000..2f4d414e93 Binary files /dev/null and b/content/hardware/05.nicla/boards/nicla-vision/tutorials/image-classification/assets/edge_impulse_training.png differ diff --git a/content/hardware/05.nicla/boards/nicla-vision/tutorials/image-classification/assets/features.png b/content/hardware/05.nicla/boards/nicla-vision/tutorials/image-classification/assets/features.png new file mode 100644 index 0000000000..8b968e496c Binary files /dev/null and b/content/hardware/05.nicla/boards/nicla-vision/tutorials/image-classification/assets/features.png differ diff --git a/content/hardware/05.nicla/boards/nicla-vision/tutorials/image-classification/assets/github_actions.png 
b/content/hardware/05.nicla/boards/nicla-vision/tutorials/image-classification/assets/github_actions.png new file mode 100644 index 0000000000..7b2382db06 Binary files /dev/null and b/content/hardware/05.nicla/boards/nicla-vision/tutorials/image-classification/assets/github_actions.png differ diff --git a/content/hardware/05.nicla/boards/nicla-vision/tutorials/image-classification/assets/github_model_path.png b/content/hardware/05.nicla/boards/nicla-vision/tutorials/image-classification/assets/github_model_path.png new file mode 100644 index 0000000000..56925b1b15 Binary files /dev/null and b/content/hardware/05.nicla/boards/nicla-vision/tutorials/image-classification/assets/github_model_path.png differ diff --git a/content/hardware/05.nicla/boards/nicla-vision/tutorials/image-classification/assets/github_releases.png b/content/hardware/05.nicla/boards/nicla-vision/tutorials/image-classification/assets/github_releases.png new file mode 100644 index 0000000000..96c1d7ca21 Binary files /dev/null and b/content/hardware/05.nicla/boards/nicla-vision/tutorials/image-classification/assets/github_releases.png differ diff --git a/content/hardware/05.nicla/boards/nicla-vision/tutorials/image-classification/assets/ml_edge_impulse_design.png b/content/hardware/05.nicla/boards/nicla-vision/tutorials/image-classification/assets/ml_edge_impulse_design.png new file mode 100644 index 0000000000..dd4296e3c3 Binary files /dev/null and b/content/hardware/05.nicla/boards/nicla-vision/tutorials/image-classification/assets/ml_edge_impulse_design.png differ diff --git a/content/hardware/05.nicla/boards/nicla-vision/tutorials/image-classification/assets/ml_supervised_learning.svg b/content/hardware/05.nicla/boards/nicla-vision/tutorials/image-classification/assets/ml_supervised_learning.svg new file mode 100644 index 0000000000..af542369d8 --- /dev/null +++ b/content/hardware/05.nicla/boards/nicla-vision/tutorials/image-classification/assets/ml_supervised_learning.svg @@ -0,0 +1,81 @@ + + 
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/content/hardware/05.nicla/boards/nicla-vision/tutorials/image-classification/assets/model_testing.png b/content/hardware/05.nicla/boards/nicla-vision/tutorials/image-classification/assets/model_testing.png new file mode 100644 index 0000000000..eb73edf0cf Binary files /dev/null and b/content/hardware/05.nicla/boards/nicla-vision/tutorials/image-classification/assets/model_testing.png differ diff --git a/content/hardware/05.nicla/boards/nicla-vision/tutorials/image-classification/assets/omv_new_dataset.png b/content/hardware/05.nicla/boards/nicla-vision/tutorials/image-classification/assets/omv_new_dataset.png new file mode 100644 index 0000000000..da2082bdf6 Binary files /dev/null and b/content/hardware/05.nicla/boards/nicla-vision/tutorials/image-classification/assets/omv_new_dataset.png differ diff --git a/content/hardware/05.nicla/boards/nicla-vision/tutorials/image-classification/assets/vs_openmv_ml_classes.png b/content/hardware/05.nicla/boards/nicla-vision/tutorials/image-classification/assets/vs_openmv_ml_classes.png new file mode 100644 index 0000000000..5341ae783f Binary files /dev/null and b/content/hardware/05.nicla/boards/nicla-vision/tutorials/image-classification/assets/vs_openmv_ml_classes.png differ diff --git a/content/hardware/05.nicla/boards/nicla-vision/tutorials/image-classification/assets/vs_openmv_ml_edge_impulse_data.png b/content/hardware/05.nicla/boards/nicla-vision/tutorials/image-classification/assets/vs_openmv_ml_edge_impulse_data.png new file mode 100644 index 0000000000..b9aaa811dd Binary files /dev/null and b/content/hardware/05.nicla/boards/nicla-vision/tutorials/image-classification/assets/vs_openmv_ml_edge_impulse_data.png differ diff --git a/content/hardware/05.nicla/boards/nicla-vision/tutorials/image-classification/content.md 
b/content/hardware/05.nicla/boards/nicla-vision/tutorials/image-classification/content.md new file mode 100644 index 0000000000..b0d9bff323 --- /dev/null +++ b/content/hardware/05.nicla/boards/nicla-vision/tutorials/image-classification/content.md @@ -0,0 +1,226 @@
+---
+title: Image Classification with Edge Impulse
+difficulty: intermediate
+tags: [Machine Learning, Edge Impulse, TinyML, TensorFlow]
+description: This tutorial teaches you how to train a custom machine learning model with Edge Impulse and how to do image classification on the Arduino Nicla Vision.
+author: Sebastian Romero
+---
+
+## Overview
+
+This tutorial teaches you how to train a custom machine learning model with Edge Impulse and how to do image classification on the Arduino Nicla Vision. The machine learning (ML) model will use the TensorFlow Lite format and the classification example will run on OpenMV.
+
+## Goals
+
+- Learn how to create datasets to be used for classification
+- Learn how to train an ML model in Edge Impulse
+- Learn how to use OpenMV to run a classification example
+- Learn how to embed an ML model in the OpenMV firmware
+
+## Required Hardware and Software
+
+- [Nicla Vision board](https://store.arduino.cc/nicla-vision)
+- Micro USB cable
+- An [Edge Impulse](https://studio.edgeimpulse.com/) account for training the ML model
+- Fruits (or other objects) to create the classification model 🍏🍌🍐
+
+## Machine Learning on the Edge
+
+Machine learning on powerful computers has been around for a while. On microcontrollers, however, it is rather new territory. Microcontrollers may not be able to run ML models that process high resolution images at high frame rates, but they have some interesting advantages. On the one hand, microcontrollers can run at very low power on batteries for a long time. You could even put the processor to sleep and only wake it up when the camera or the on-board proximity sensor registers activity.
On the other hand, ML models on a microcontroller can run without an internet connection, as they don't need to upload data to the cloud. This means that you can install distributed ML solutions in places where there is no internet connection (edge computing). Additionally, processing data locally means that the data stays on the device, which ensures data privacy.
+
+## The Edge Impulse Platform
+
+Edge Impulse is a platform that simplifies the process of creating machine learning models by choosing reasonable defaults for the countless parameters you could set when creating an ML model. It provides a simple user interface that allows you not only to train an ML model but also to inspect the data and test the model.
+
+## Training the ML Model
+
+To train an ML model to classify an image, we need to feed it with image data of that object. During the training process, the model will be trained using a concept called [supervised learning](https://en.wikipedia.org/wiki/Supervised_learning). This means that we train the model with known data and tell it while it's "practicing" its predictions whether they are correct or not. This is similar to what happens when a toddler points at a donkey and says "horse", and you tell them that it's actually a donkey. The next few times they see a donkey they may still get it wrong, but over time, under your supervision, they will learn to correctly identify a donkey. Conceptually, that's also how our ML model learns.
+
+![For supervised learning objects are labeled beforehand with their names](assets/ml_supervised_learning.svg)
+
+### Overfitting
+
+One thing to watch out for is overfitting. If a machine learning model is overfitting, it means that it is geared too closely towards your training data and won't perform well with unseen input data.
To get back to the above example: once the toddler has seen many donkeys, all of which had perfectly gray fur, were 170 cm long and 127 cm tall, they may have learned that a donkey must be exactly like that, otherwise it's not a donkey. If a donkey then shows up that is slightly taller, the toddler would conclude that it's not a donkey. And even if there was one brownish, slightly taller donkey that the toddler has seen many times, it wouldn't necessarily help, as the toddler may just remember that one specific donkey that looks a bit odd. It may not have learned that donkeys can actually have different shades of color in their fur.
+
+You need some variation in the training dataset, and you need to adjust the parameters so that the model doesn't just learn all the input data by heart and base its classification on that; instead, you want the model to learn the concept of an object. Luckily, in the real world this rarely ever happens with toddlers. In machine learning, however, it's a common pitfall.
+
+Finding the right configuration for your application often requires trial and error. Edge Impulse explains in [this article](https://docs.edgeimpulse.com/docs/increasing-model-performance) how to improve poorly performing machine learning models.
+
+### 1. Creating a Dataset
+
+The first step is to create a representative dataset of the objects that the ML model is supposed to identify. The key is to have as much diversity in the data as possible. If we show it, for example, only one specific apple that has a certain size, shape and peel, then it won't be very good at recognizing other apples that look different. This is referred to as bias and should be avoided as much as possible. In addition, you need to teach the model what an apple is not. For that purpose you feed it random image data of things that are not an apple. You could name that class of image data "unknown".
If you don't have such a class and the model has only ever seen an apple, it won't know what to do if there is no apple in the image.
+
+Creating datasets in OpenMV is simple, as there is a built-in function to create them. Before you proceed, connect your Nicla Vision board and click on the connect button in the OpenMV IDE. If you haven't set up your board for OpenMV yet, please consult the [getting started tutorial](https://docs.arduino.cc/tutorials/nicla-vision/getting-started).
+Create a new dataset by using the menu command **Tools->Dataset Editor->New Dataset** and name it `Dataset-Fruits`.
+
+![The Dataset Editor can be found in the Tools menu](assets/omv_new_dataset.png)
+
+The next step is to create image classes. A class represents a unique type of object, in this case the type of fruit.
+First, create a new image class and name it `orange` by clicking on "New Class Folder" in the toolbar. Now run the image capturing script that is already open by clicking the play button. Point the camera at the orange and click on **Capture Data** to snap a picture of it. To conveniently hold the camera with the cable facing down, you can use the following lines of code to flip the image accordingly:
+
+```python
+sensor.set_vflip(True) # Flips the image vertically
+sensor.set_hmirror(True) # Mirrors the image horizontally
+```
+
+Capture the orange from different angles and with different backgrounds to make the recognition more robust later on. Repeat this for the other fruits that you would like to classify (e.g. a pear and a banana). Add an `unknown` class and capture some images of different backgrounds, without the fruits, that you want to use during the classification later on.
+
+![The various image classes can be created directly in the dataset editor](assets/vs_openmv_ml_classes.png)
+
+You may have also noticed that there is a labels text file. This file is used to store a textual representation of the classes to later classify the objects and print the class names.
The classes are added to it automatically.
+
+***Please note that creating a machine learning model with training data based on just one specific piece of fruit, while always using the same background, does not create a robust model. It will perform well in the controlled environment but will struggle when presented with new data.***
+
+### 2. Uploading the Data to Edge Impulse
+Now that all data is ready to be uploaded, you need to create a new Edge Impulse project. If you haven't registered an Edge Impulse account yet, you may create one on [their website](https://studio.edgeimpulse.com/login). Log in to the Edge Impulse Studio and create a new project named `Fruit-Detector`.
+
+After that you can go back to the OpenMV IDE and select **Tools->Dataset Editor->Export->Log in to Edge Impulse Account and Upload to Project**. The OpenMV IDE will ask you for your Edge Impulse login credentials. Select the project that you just created and click OK. Leave the dataset split setting at the default. This will keep 20% of the images aside for testing the model once it has been trained. That allows you to assess how well your model performs at detecting the objects with data that it hasn't seen yet.
+
+![You need to log in with your Edge Impulse account when uploading a dataset for the first time](assets/edge_impulse_login.png)
+
+
+### 3. Acquire Data
+
+Open your project in the Edge Impulse Studio and navigate to "Data Acquisition". You can see that the images have been uploaded and labeled according to the classes that you created. With this tool you can browse through the image samples and remove the ones which you don't deem valuable for the training (e.g. if one of the images is too blurry). You could also do that in the OpenMV IDE before you upload the data.
+
+![The Data Acquisition tool allows you to inspect the uploaded assets](assets/vs_openmv_ml_edge_impulse_data.png)
+
+
+
+Make sure to have a good training / test data split ratio of around 80/20.
The test data is used to test the model with "unseen" data after the training has finished. If you have an overfitting model, you may see high accuracy in the training results but poor performance in the testing results. If that's the case, you may have to tweak the parameters or collect more or better training data. More information on this can be found in the Edge Impulse documentation referenced above.
+
+
+
+![The split ratio between training data and test data should be around 80/20](assets/data_split_ratio.png)
+
+### 4. Create an Impulse
+
+If you're happy with the data samples, you can move on to designing your impulse. An impulse is, in a nutshell, a recipe with which the model is trained. It defines processing actions that are performed on your input data to make them better suited for machine learning, and a learning block that defines the algorithm for the classification. In the menu, navigate to "Create Impulse" under "Impulse Design" and add an **Image** processing block as well as a **Transfer Learning** learning block.
+It's recommended to adjust the image size to 48x48 for improved performance. You can try higher resolutions, but you will notice that the frame rate during the classification drops significantly. Click on "Save Impulse" to apply the adjusted settings.
+
+![An Impulse consists of the building blocks needed to train an ML model](assets/ml_edge_impulse_design.png)
+
+
+### 5. Generate Features
+
+In this step you will adjust the image settings and generate the features from the input data. Features are unique properties that the classification algorithm uses to detect the objects. A feature can be the round shape of an orange or the fact that an image of a banana has many bright pixels, as bananas are mostly yellow.
+In the menu, navigate to "Image" under "Impulse Design". Set the color depth to "RGB" and save the parameters.
+
+![In the image inspection tool you can set the color depth according to the input data](assets/dsp_parameters.png)
+
+Then click on "Generate Features". The analysis process will take a while to complete, depending on the amount of images that you uploaded. When it's done you can inspect the results. On the right hand side you can see a visualization of the features in a 3D space. You can see that some bananas (blue dots) and pears (green dots) are somewhat hard to tell apart, possibly due to their elongated shape and their stems, and therefore have some data points in close proximity. An orange, on the other hand, is easier to distinguish as it looks quite different.
+
+![The feature explorer allows you to visually inspect the clusters of images in regards to their properties](assets/features.png)
+
+### 6. Train the Model
+
+Now that the features of your image data are ready to be used for the actual training, you can navigate to "Transfer Learning" in the menu. You need to tweak the settings slightly. Set the "Number of training cycles" to a number that yields good results; in this example we chose 80. This defines how many times the model is trained. The model gets better with each cycle, the same way you get better at riding a bike the more you practice.
+
+***Choose `MobileNetV2 96x96 0.1` as the model type. This will use roughly 200 KB of flash memory. A model with higher ROM usage will likely not fit in the flash!***
+
+In this example we also increased the dropout rate to 0.15 and the number of output neurons to 12. This increased the accuracy with the given training / test data. You may need to adapt those values based on your own data.
+Click on "Start Training" to train the machine learning model. A small set of images, the **validation set**, is put aside before the training starts to validate the trained model. This is not to be confused with the **test set**, which is used to evaluate the final model.
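The roles of the three subsets (training, validation and test set) can be sketched in plain Python. The 80/10/10 split below is only illustrative, not Edge Impulse's exact default:

```python
import random

def split_dataset(samples, train=0.8, validation=0.1, seed=42):
    """Shuffle and split labeled samples into train/validation/test sets.

    - train set:      used to fit the model weights
    - validation set: used during training to monitor and tune the model
    - test set:       held back entirely until the final evaluation
    """
    shuffled = samples[:]
    random.Random(seed).shuffle(shuffled)
    n_train = int(len(shuffled) * train)
    n_val = int(len(shuffled) * validation)
    return (shuffled[:n_train],
            shuffled[n_train:n_train + n_val],
            shuffled[n_train + n_val:])

# 100 dummy (image, label) pairs standing in for captured fruit photos
dataset = [("img_%03d.jpg" % i, "orange") for i in range(100)]
train_set, val_set, test_set = split_dataset(dataset)
print(len(train_set), len(val_set), len(test_set))  # 80 10 10
```

The important property is that the three sets are disjoint: no image used for fitting the weights is ever reused to validate or test the model.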
Once the training finishes, you will see some statistics on how well the model performed during validation. Ideally you get an accuracy of 100% for each object. If you get poor results, you may have some images which are not representative of the objects you're trying to classify; those should be removed from the dataset.
+
+![The confusion matrix shows the accuracy of the ML model after the last training cycle](assets/edge_impulse_training.png)
+
+### 7. Test the Model
+
+After training the model, you will have an idea of how well the model performs on the data that it knows from the training. That is only half of the story. You also need to know how well it performs on unseen data. In almost any real-world application, a model will be confronted only with unseen data. Being able to cope with that is crucial. Edge Impulse Studio provides a tool to easily test the model. You can find it under "Model Testing". The model testing results will give you an insight into the performance. If the model gets bad results while testing but had a good accuracy after training, it may be overfitting.
+
+You may ask yourself why this model performs so well even though it is not robust at all. It's because the data used for testing comes from the same controlled environment as the training data. The test images have the same background and feature the exact same fruits as the training images. If you wait a few days until the banana turns brown, you will see a decrease in performance.
+
+
+
+![The testing results give you a better understanding on how the model performs with unseen data.](assets/model_testing.png)
+
+## Using the ML Model
+
+The ML model is trained and already optimized for use with microcontrollers. This is done automatically in the background through quantization. Quantization is a process in which the numbers in a machine learning model are constrained in their value range, for example by mapping 32-bit floating point values to 8-bit integers, for improved performance while sacrificing a bit of accuracy.
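The idea behind quantization can be illustrated with a simplified sketch (this is not TensorFlow Lite's exact scheme): each float is mapped onto a signed 8-bit integer via a scale factor and a zero point, and mapping it back recovers a value that is close to, but not exactly, the original.

```python
def quantize(values, num_bits=8):
    """Affine-quantize a list of floats to signed integers (e.g. int8).

    Returns the quantized values plus the (scale, zero_point) pair
    needed to map them back to floats.
    """
    qmin, qmax = -(2 ** (num_bits - 1)), 2 ** (num_bits - 1) - 1
    lo, hi = min(values), max(values)
    scale = (hi - lo) / (qmax - qmin) or 1.0  # avoid a zero scale
    zero_point = round(qmin - lo / scale)
    # Round to the nearest integer step and clip to the int8 range
    return ([max(qmin, min(qmax, round(v / scale) + zero_point)) for v in values],
            scale, zero_point)

def dequantize(q, scale, zero_point):
    """Map quantized integers back to approximate float values."""
    return [(v - zero_point) * scale for v in q]

weights = [-0.52, -0.11, 0.0, 0.27, 0.49]  # made-up example weights
q, scale, zp = quantize(weights)
restored = dequantize(q, scale, zp)
# Each restored weight is close to, but not exactly, the original:
print(max(abs(w - r) for w, r in zip(weights, restored)))
```

Storing one byte instead of four per weight is what lets a model like the one above fit into roughly 200 KB of flash, at the cost of the small rounding error shown.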
+
+### Deploy
+
+Deploying the ML model to your board requires a few steps. The Edge Impulse Studio provides an export feature for OpenMV. Switch to the deployment section in the menu, select OpenMV under "Build firmware" and click "Build". This will create an OpenMV compatible library and download it as a zip file. Unzip it.
+
+![The Edge Impulse Studio has a built-in export function for OpenMV](assets/deployment.png)
+
+Since the Nicla Vision doesn't have an SD card slot to store the model file, we need to build the machine learning model into the firmware and load it from the flash. To do so, go to https://github.com/openmv/openmv and fork the repository. In your fork, click on "Actions" and enable the workflows by clicking on the green button.
+
+Rename the machine learning model and the label file to `fruit_detection.tflite` and `fruit_detection.txt` respectively. In your fork, replace the built-in machine learning model under `src/lib/libtf/models` with the model you downloaded from the Edge Impulse Studio. Commit the files and push the commit to the repository. A new firmware will be built automatically.
+
+![The model that shall be baked into the firmware needs to be stored under src/lib/libtf/models](assets/github_model_path.png)
+
+You can inspect the build process under "Actions".
+
+![In the actions section you can monitor the build process once it starts.](assets/github_actions.png)
+
+Once the firmware has been built, you can download it from the releases section in the "Code" tab. Put the board in bootloader mode and click on the connect symbol in the OpenMV IDE. In the dialog, select "Load a specific firmware". Select the firmware that you just created and flash it to the board.
+
+![In the release section you can find the generated firmware ready to download and install.](assets/github_releases.png)
+
+### Run the Script
+
+The final step is to run the **ei_image_classification.py** script. Open it in the OpenMV IDE.
As the model is now baked into the firmware, you need to adjust the lines where the script loads the model and the labels as follows:
+
+```python
+labels, net = tf.load_builtin_model('fruit_detection')
+```
+
+Also, replace the print statement in the innermost for loop with the following code:
+
+```python
+confidence = predictions_list[i][1]
+label = predictions_list[i][0]
+print("%s = %f" % (label[2:], confidence))
+
+if confidence > 0.9 and label != "unknown":
+    print("It's a ", label, "!")
+```
+
+This code will print a message such as "It's a orange!" if the confidence is above 90%. In the following screenshot you can see that the orange was detected with a confidence level of 0.99, which corresponds to 99%.
+
+![In this example the apple is detected with a 100% certainty](assets/edge_impulse_classification.png)
+
+Try pointing the camera of your board at any of the fruits or other objects that you used for the training and check if they can be recognized successfully.
+
+The complete script of the classification example is as follows:
+
+```python
+import sensor, image, time, os, tf
+
+sensor.reset() # Reset and initialize the sensor.
+sensor.set_pixformat(sensor.RGB565) # Set pixel format to RGB565 (or GRAYSCALE)
+sensor.set_framesize(sensor.QVGA) # Set frame size to QVGA (320x240)
+sensor.set_vflip(True)
+sensor.set_hmirror(True)
+sensor.set_windowing((240, 240)) # Set 240x240 window.
+sensor.skip_frames(time=2000) # Let the camera adjust.
+
+labels, net = tf.load_builtin_model('fruit_detection')
+
+clock = time.clock()
+while(True):
+    clock.tick()
+
+    img = sensor.snapshot()
+
+    # default settings just do one detection... change them to search the image...
+    for obj in tf.classify(net, img, min_scale=1.0, scale_mul=0.8, x_overlap=0.5, y_overlap=0.5):
+        print("**********\nPredictions at [x=%d,y=%d,w=%d,h=%d]" % obj.rect())
+        img.draw_rectangle(obj.rect())
+        # This combines the labels and confidence values into a list of tuples
+        predictions_list = list(zip(labels, obj.output()))
+
+        for i in range(len(predictions_list)):
+            confidence = predictions_list[i][1]
+            label = predictions_list[i][0]
+            print("%s = %f" % (label, confidence))
+
+            if confidence > 0.9 and label != "unknown":
+                print("It's a", label, "!")
+
+    print(clock.fps(), "fps")
+```
+
+## Conclusion
+
+You have learned about classification as a machine learning concept that categorizes a set of data into classes. You have also learned how supervised learning works and what quantization of a model means. Furthermore, you have learned how to train a custom TFLite machine learning model and deploy it to your board. diff --git a/content/hardware/05.nicla/boards/nicla-vision/tutorials/microphone_sensor/assets/OpenMV_spectrumAnalyzer.png b/content/hardware/05.nicla/boards/nicla-vision/tutorials/microphone_sensor/assets/OpenMV_spectrumAnalyzer.png new file mode 100644 index 0000000000..5f4dc6df21 Binary files /dev/null and b/content/hardware/05.nicla/boards/nicla-vision/tutorials/microphone_sensor/assets/OpenMV_spectrumAnalyzer.png differ diff --git a/content/hardware/05.nicla/boards/nicla-vision/tutorials/microphone_sensor/assets/nicla-vision-microphone.png b/content/hardware/05.nicla/boards/nicla-vision/tutorials/microphone_sensor/assets/nicla-vision-microphone.png new file mode 100644 index 0000000000..6ce8ec646f Binary files /dev/null and b/content/hardware/05.nicla/boards/nicla-vision/tutorials/microphone_sensor/assets/nicla-vision-microphone.png differ diff --git a/content/hardware/05.nicla/boards/nicla-vision/tutorials/microphone_sensor/content.md b/content/hardware/05.nicla/boards/nicla-vision/tutorials/microphone_sensor/content.md new file mode 100644 index
0000000000..403c41c311 --- /dev/null +++ b/content/hardware/05.nicla/boards/nicla-vision/tutorials/microphone_sensor/content.md @@ -0,0 +1,192 @@
+---
+title: 'Reading Audio Samples With the Onboard Microphone'
+difficulty: easy
+description: 'Learn how to create a soundmeter using the built-in microphone with the Nicla Vision.'
+tags:
+  - OpenMV
+  - Microphone
+  - Sound
+  - Sensor
+author: Pablo Marquínez
+libraries:
+  - name: Arduino PDM
+    url: https://www.arduino.cc/en/Reference/PDM
+hardware:
+  - hardware/05.nicla/boards/nicla-vision
+software:
+  - OpenMV
+  - ide-v1
+  - ide-v2
+  - web-editor
+  - cli
+---
+
+![Nicla Vision - microphone](assets/nicla-vision-microphone.png)
+
+## Overview
+
+In this tutorial you will use the **Arduino Nicla Vision** board to read the microphone (MP34DT06JTR) values and control the blinking of the built-in RGB LED.
+
+## Goals
+
+- Get the microphone data.
+- Use the PDM (Pulse Density Modulation) library.
+- Print the microphone values in the Serial Monitor.
+- Change the RGB LED's blinking speed with the last microphone reading (Arduino IDE).
+- Show the values on a spectrum analyzer (OpenMV only).
+
+### Required Hardware and Software
+
+- [Nicla Vision board](https://store.arduino.cc/products/nicla-vision)
+- Latest Mbed core version
+- Latest OpenMV IDE version
+
+## Set Up
+
+To check that you correctly set up the board, please visit our [Getting Started Guide](https://docs.arduino.cc/tutorials/nicla-vision/getting-started) for both **OpenMV** and **Arduino** instructions.
+
+## Instructions
+
+### OpenMV
+
+Open the script by going to **Examples > Arduino > NanoRP2040 > Audio > Audio_fft.py**.
+
+***We use the same sketch as for the Nano RP2040 Connect because both boards access the microphone in the same way.***
+
+Make sure the board is connected. If the board is connected to OpenMV, you should see a green play button in the bottom left corner of the window. If you do not see this icon, try pressing the connect button in the bottom left corner.
If you still have issues connecting the board, take another look at the Getting Started guide.
+
+When the script is running, you will see a spectrum analyzer in the top right panel that reflects the audio input. Try making some noise and see how it reacts.
+
+![OpenMV IDE - Spectrum analyzer](assets/OpenMV_spectrumAnalyzer.png)
+
+### Arduino
+
+#### Setting Up the Sketch
+
+We will edit an example from the Mbed core. Go to **Examples > PDM > PDMSerialPlotter** and save it into your sketchbook.
+
+You can run the sketch to see the result: it will show the data that the microphone is capturing on the **Serial Plotter**.
+
+#### Controlling the Blinking LED
+
+Now that you can get the microphone data, let's control the built-in RGB LED and change the speed of its blinking depending on the readings. We set the blinking time to the last reading of the microphone, so the blinking will be slow if the sound is loud and fast if it is quiet.
+
+You can access the example sketch at **Examples > PDM > PDMSerialPlotter** and then edit it as shown in this tutorial,
+or find the full edited sketch in our **Arduino_Pro_Tutorials** library.
+
+#### Complete Sketch
+
+```arduino
+/*
+  This example reads audio data from the on-board PDM microphones, and prints
+  out the samples to the Serial console. The Serial Plotter built into the
+  Arduino IDE can be used to plot the audio data (Tools -> Serial Plotter)
+
+  Circuit:
+  - Arduino Nicla Vision, or
+  - Arduino Nano 33 BLE board, or
+  - Arduino Nano RP2040 Connect, or
+  - Arduino Portenta H7 board plus Portenta Vision Shield
+
+  This example code is in the public domain.
+*/
+
+#include <PDM.h>
+
+// default number of output channels
+static const char channels = 1;
+
+// default PCM output frequency
+static const int frequency = 16000;
+
+// Buffer to read samples into, each sample is 16-bits
+short sampleBuffer[512];
+
+// Number of audio samples read
+volatile int samplesRead;
+
+// Blinking
+bool state = false;
+int timeStart = 0;
+
+void setup() {
+  Serial.begin(9600);
+  pinMode(LEDB, OUTPUT);
+
+  while (!Serial);
+
+  // Configure the data receive callback
+  PDM.onReceive(onPDMdata);
+
+  // Optionally set the gain
+  // Defaults to 20 on the BLE Sense and 24 on the Portenta Vision Shield
+  // PDM.setGain(30);
+
+  // Initialize PDM with:
+  // - one channel (mono mode)
+  // - a 16 kHz sample rate for the Arduino Nano 33 BLE Sense
+  // - a 32 kHz or 64 kHz sample rate for the Arduino Portenta Vision Shield
+  if (!PDM.begin(channels, frequency)) {
+    Serial.println("Failed to start PDM!");
+    while (1);
+  }
+}
+
+void loop() {
+  // Wait for samples to be read
+  if (samplesRead) {
+
+    // Print samples to the serial monitor or plotter
+    for (int i = 0; i < samplesRead; i++) {
+      if (channels == 2) {
+        Serial.print("L:");
+        Serial.print(sampleBuffer[i]);
+        Serial.print(" R:");
+        i++;
+      }
+      Serial.println(sampleBuffer[i]);
+    }
+
+    // Clear the read count
+    samplesRead = 0;
+
+    // Toggle the LED once the blinking period (taken from a microphone
+    // sample) has elapsed, then restart the timer
+    if (millis() - timeStart > sampleBuffer[2]) {
+      digitalWrite(LEDB, state);
+      state = !state;
+      timeStart = millis();
+    }
+  }
+}
+
+/**
+  Callback function to process the data from the PDM microphone.
+  NOTE: This callback is executed as part of an ISR.
+  Therefore using `Serial` to print messages inside this function isn't supported.
+*/
+void onPDMdata() {
+  // Query the number of available bytes
+  int bytesAvailable = PDM.available();
+
+  // Read into the sample buffer
+  PDM.read(sampleBuffer, bytesAvailable);
+
+  // 16-bit, 2 bytes per sample
+  samplesRead = bytesAvailable / 2;
+}
+```
+
+### Testing It Out
+
+After you have successfully verified and uploaded the sketch to the board, open the Serial Monitor from the menu on the left. You will now see the new values printed.
+
+If you want to test it, the only thing you need to do is to speak or play some sounds close to the board and see how the blinking of the RGB LED changes based on the input.
+
+### Troubleshoot
+
+- In case the Serial Monitor freezes, unplug the board from your computer and plug it in again, then try to upload the sketch once more.
+- If the sketch is not working, try double-tapping the reset button and upload the sketch once again.
+
+## Conclusion
+
+You have learned how to use the Arduino IDE and OpenMV to get data from the microphone and then use it to change the RGB LED on the board. This can, for example, be used as an alarm system that wakes the board up and takes a picture with the camera.
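
As a closing note on the blink logic: the sketch keys the LED period off a single raw sample (`sampleBuffer[2]`), which can even be negative. A steadier approach, sketched here as a hypothetical helper (not part of the sketch above or of the PDM library), is to take the peak amplitude of the whole buffer and map it to a blink interval. A minimal, plain C++ sketch of that idea:

```cpp
#include <algorithm>
#include <cstdint>
#include <cstdlib>

// Peak absolute amplitude of a block of signed 16-bit PCM samples.
int16_t peakAmplitude(const int16_t *buf, int n) {
  int peak = 0;
  for (int i = 0; i < n; i++) {
    peak = std::max(peak, std::abs(static_cast<int>(buf[i])));
  }
  return static_cast<int16_t>(std::min(peak, 32767));
}

// Map the peak (0..32767) to a blink interval in milliseconds:
// quiet -> fast (100 ms), loud -> slow (2000 ms), matching the
// behavior described in the tutorial.
long blinkIntervalMs(int16_t peak) {
  return 100 + (static_cast<long>(peak) * (2000 - 100)) / 32767;
}
```

A silent buffer then maps to the fastest blink (100 ms) and a full-scale signal to the slowest (2000 ms), and the mapping no longer depends on which single sample happens to sit at index 2.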
\ No newline at end of file diff --git a/content/hardware/05.nicla/boards/nicla-vision/tutorials/nicla-vision-imu/assets/nicla-vision-imu.png b/content/hardware/05.nicla/boards/nicla-vision/tutorials/nicla-vision-imu/assets/nicla-vision-imu.png new file mode 100644 index 0000000000..9c25f56da2 Binary files /dev/null and b/content/hardware/05.nicla/boards/nicla-vision/tutorials/nicla-vision-imu/assets/nicla-vision-imu.png differ diff --git a/content/hardware/05.nicla/boards/nicla-vision/tutorials/nicla-vision-imu/assets/nicla_vision_acceleration.png b/content/hardware/05.nicla/boards/nicla-vision/tutorials/nicla-vision-imu/assets/nicla_vision_acceleration.png new file mode 100644 index 0000000000..60984b8d1d Binary files /dev/null and b/content/hardware/05.nicla/boards/nicla-vision/tutorials/nicla-vision-imu/assets/nicla_vision_acceleration.png differ diff --git a/content/hardware/05.nicla/boards/nicla-vision/tutorials/nicla-vision-imu/assets/nicla_vision_gyroscope.png b/content/hardware/05.nicla/boards/nicla-vision/tutorials/nicla-vision-imu/assets/nicla_vision_gyroscope.png new file mode 100644 index 0000000000..a0ae9273bf Binary files /dev/null and b/content/hardware/05.nicla/boards/nicla-vision/tutorials/nicla-vision-imu/assets/nicla_vision_gyroscope.png differ diff --git a/content/hardware/05.nicla/boards/nicla-vision/tutorials/nicla-vision-imu/content.md b/content/hardware/05.nicla/boards/nicla-vision/tutorials/nicla-vision-imu/content.md new file mode 100644 index 0000000000..435388d827 --- /dev/null +++ b/content/hardware/05.nicla/boards/nicla-vision/tutorials/nicla-vision-imu/content.md @@ -0,0 +1,204 @@ +--- +title: 'Accessing IMU Data on Nicla Vision' +difficulty: easy +compatible-products: [nicla-vision] +description: 'Learn how to access the data from the accelerometer and gyroscope that comes with the LSM6DSOXTR IMU module.' 
+tags:
+  - Gyroscope
+  - Accelerometer
+author: 'Benjamin Dannegård'
+libraries:
+  - name: Arduino LSM6DSOX
+    url: https://www.arduino.cc/en/Reference/ArduinoLSM6DSOX
+hardware:
+  - hardware/05.nicla/boards/nicla-vision
+software:
+  - ide-v1
+  - ide-v2
+  - web-editor
+---
+
+## Overview
+
+In this tutorial, we will learn how to access the gyroscope and accelerometer on the Nicla Vision board. For this, we will be using the [Arduino_LSM6DSOX](https://www.arduino.cc/en/Reference/ArduinoLSM6DSOX) library and the Arduino IDE, and we will print the values in the Serial Monitor of the Arduino IDE.
+
+## Goals
+
+The goals of this project are:
+
+- Read accelerometer data.
+- Read gyroscope data.
+- Print the data in the Serial Monitor.
+
+### Hardware & Software Needed
+
+- Arduino IDE ([online](https://create.arduino.cc/) or [offline](https://www.arduino.cc/en/main/software)).
+- [LSM6DSOX library](https://github.com/arduino-libraries/Arduino_LSM6DSOX)
+- [Nicla Vision board](https://store.arduino.cc/products/nicla-vision)
+
+## IMU (Inertial Measurement Unit)
+
+An IMU is a component that consists of different sensors and records data such as specific force, angular rate and orientation. The Nicla Vision's IMU has a **gyroscope** and an **accelerometer**. In the image below you can see exactly where the IMU is located on the board.
+
+![Placement of IMU on the Nicla Vision](assets/nicla-vision-imu.png)
+
+### Accelerometer & Gyroscope
+
+An accelerometer is an electromechanical device used to measure acceleration forces. Such forces may be static, like the continuous force of gravity or, as is the case with many mobile devices, dynamic to sense movement or vibrations.
+
+![Illustration of Nicla Vision accelerometer axis.](assets/nicla_vision_acceleration.png)
+
+A gyroscope sensor is a device that can measure and maintain the orientation and angular velocity of an object.
Gyroscopes are more advanced than accelerometers, as they can measure the tilt and lateral orientation of an object, whereas an accelerometer can only measure its linear motion. Gyroscope sensors are also called "Angular Rate Sensors" or "Angular Velocity Sensors". Angular velocity, measured in degrees per second, is the change in the rotational angle of the object per unit of time.
+
+![Illustration of Nicla Vision gyroscope axis.](assets/nicla_vision_gyroscope.png)
+
+In this tutorial, we will use the gyroscope as an indicator for the direction of the force that is applied to the board. We will also use the accelerometer as a "level" that will provide information about the position of the board. With this application we will be able to read the relative position of the board, as well as its tilt in degrees when tilting the board up, down, left or right. The results will be visible through the Serial Monitor.
+
+## Instructions
+
+### Setting up the Arduino IDE
+
+Make sure the latest Nicla core is installed in the Arduino IDE: go to **Tools > Board > Boards Manager...**, look for **Arduino Mbed OS Nicla Boards** and install it. Next, we need to install the library for the IMU: go to **Tools > Manage Libraries...**, search for **Arduino_LSM6DSOX** and install it.
+
+### IMU Sketch
+
+The full sketch can be found at the end of the **Instructions** section. Upload the sketch to the board.
+
+To use the IMU we first include the library. To make it easier to work with the values from the IMU, we create a variable for each axis.
+
+```arduino
+#include <Arduino_LSM6DSOX.h>
+
+float Ax, Ay, Az;
+float Gx, Gy, Gz;
+```
+
+To initialize the library we need to call `IMU.begin()`. When the IMU is initialized, we can quickly check the sample rates of the sensors. Calling `IMU.accelerationSampleRate()` and `IMU.gyroscopeSampleRate()` will read the sampling rate of the respective sensor in Hz.
+
+```arduino
+void setup() {
+  Serial.begin(9600);
+
+  while (!Serial);
+
+  if (!IMU.begin()) {
+    Serial.println("Failed to initialize IMU!");
+    while (1);
+  }
+
+  Serial.print("Accelerometer sample rate = ");
+  Serial.print(IMU.accelerationSampleRate());
+  Serial.println("Hz");
+  Serial.println();
+
+  Serial.print("Gyroscope sample rate = ");
+  Serial.print(IMU.gyroscopeSampleRate());
+  Serial.println("Hz");
+  Serial.println();
+}
+```
+
+In the loop of the sketch we check whether there is new data available from the IMU sensors, using `IMU.accelerationAvailable()` and `IMU.gyroscopeAvailable()`. We can then call `IMU.readAcceleration(Ax, Ay, Az)` to read the accelerometer: it will return the values of the **x**, **y** and **z** axes and update the variables `Ax`, `Ay` and `Az`. We do the same for the gyroscope, formatting the output in the Serial Monitor so the data is a bit easier to read. The data is printed with an interval of 500 milliseconds, which can be adjusted by changing the `delay(500)` line at the bottom of the sketch.
+
+```arduino
+void loop() {
+
+  if (IMU.accelerationAvailable()) {
+    IMU.readAcceleration(Ax, Ay, Az);
+
+    Serial.println("Accelerometer data: ");
+    Serial.print(Ax);
+    Serial.print('\t');
+    Serial.print(Ay);
+    Serial.print('\t');
+    Serial.println(Az);
+    Serial.println();
+  }
+
+  if (IMU.gyroscopeAvailable()) {
+    IMU.readGyroscope(Gx, Gy, Gz);
+
+    Serial.println("Gyroscope data: ");
+    Serial.print(Gx);
+    Serial.print('\t');
+    Serial.print(Gy);
+    Serial.print('\t');
+    Serial.println(Gz);
+    Serial.println();
+  }
+
+  delay(500);
+}
+```
+
+### Testing It Out
+
+After successfully uploading the code to the board, we will need to open the Serial Monitor to initialize the program. Once we open it, data will start printing.
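
Conceptually, the "level" idea mentioned earlier (reading the board's tilt in degrees from the accelerometer) comes down to a little trigonometry. As an illustrative sketch in plain C++, with helper names of our own (they are not part of the Arduino_LSM6DSOX library), pitch and roll can be estimated from the three acceleration components, assuming the board is held still so gravity dominates the measurement:

```cpp
#include <cmath>

const double kDegPerRad = 180.0 / 3.14159265358979323846;

// Pitch (rotation about Y, in degrees) estimated from accelerometer
// readings in g. Only valid while the board is static, so the measured
// acceleration vector is mostly gravity.
double pitchDegrees(double ax, double ay, double az) {
  return std::atan2(-ax, std::sqrt(ay * ay + az * az)) * kDegPerRad;
}

// Roll (rotation about X, in degrees) from the Y and Z components.
double rollDegrees(double ay, double az) {
  return std::atan2(ay, az) * kDegPerRad;
}
```

A board lying flat (Ax = 0, Ay = 0, Az = 1) reports 0° pitch and 0° roll; tilting it along one axis drives the corresponding angle toward ±90°. The sign conventions depend on the axis orientation shown in the illustrations above.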
+
+### Complete Sketch
+
+```arduino
+#include <Arduino_LSM6DSOX.h>
+
+float Ax, Ay, Az;
+float Gx, Gy, Gz;
+
+void setup() {
+  Serial.begin(9600);
+
+  while (!Serial);
+
+  if (!IMU.begin()) {
+    Serial.println("Failed to initialize IMU!");
+    while (1);
+  }
+
+  Serial.print("Accelerometer sample rate = ");
+  Serial.print(IMU.accelerationSampleRate());
+  Serial.println("Hz");
+  Serial.println();
+
+  Serial.print("Gyroscope sample rate = ");
+  Serial.print(IMU.gyroscopeSampleRate());
+  Serial.println("Hz");
+  Serial.println();
+}
+
+void loop() {
+
+  if (IMU.accelerationAvailable()) {
+    IMU.readAcceleration(Ax, Ay, Az);
+
+    Serial.println("Accelerometer data: ");
+    Serial.print(Ax);
+    Serial.print('\t');
+    Serial.print(Ay);
+    Serial.print('\t');
+    Serial.println(Az);
+    Serial.println();
+  }
+
+  if (IMU.gyroscopeAvailable()) {
+    IMU.readGyroscope(Gx, Gy, Gz);
+
+    Serial.println("Gyroscope data: ");
+    Serial.print(Gx);
+    Serial.print('\t');
+    Serial.print(Gy);
+    Serial.print('\t');
+    Serial.println(Gz);
+    Serial.println();
+  }
+
+  delay(500);
+}
+```
+
+## Conclusion
+
+In this tutorial we have learned how to use the **Arduino_LSM6DSOX** library to access the IMU on the Nicla Vision, and how to print the gyroscope and accelerometer data to the Arduino IDE Serial Monitor.
\ No newline at end of file diff --git a/content/hardware/05.nicla/boards/nicla-vision/tutorials/proximity/assets/nicla-vision-tof.png b/content/hardware/05.nicla/boards/nicla-vision/tutorials/proximity/assets/nicla-vision-tof.png new file mode 100644 index 0000000000..322d045c9a Binary files /dev/null and b/content/hardware/05.nicla/boards/nicla-vision/tutorials/proximity/assets/nicla-vision-tof.png differ diff --git a/content/hardware/05.nicla/boards/nicla-vision/tutorials/proximity/content.md b/content/hardware/05.nicla/boards/nicla-vision/tutorials/proximity/content.md new file mode 100644 index 0000000000..c02536b8de --- /dev/null +++ b/content/hardware/05.nicla/boards/nicla-vision/tutorials/proximity/content.md @@ -0,0 +1,173 @@
+---
+title: Proximity Detection with Arduino Nicla Vision
+difficulty: easy
+tags: [Proximity, Time Of Flight, Blink]
+description: Learn how to use the proximity sensor to vary the blinking speed of the LED.
+author: Pablo Marquínez
+libraries:
+  - name: VL53L1X
+    url: https://github.com/pololu/vl53l1x-arduino
+hardware:
+  - hardware/05.nicla/boards/nicla-vision
+software:
+  - ide-v1
+  - ide-v2
+  - web-editor
+  - cli
+---
+
+## Overview
+
+In this tutorial you will use the Nicla Vision to detect proximity, thanks to the **VL53L1X** Time of Flight (ToF) sensor.
+
+This tutorial goes through how to create a sketch that blinks the built-in RGB LED and controls the speed of its blinking with the proximity values. This can be useful for future projects where the camera should only be activated when something is detected in front of the sensor.
+
+***The Arduino sketch shown is available inside the `Arduino_Pro_Tutorials` library by going to Examples > Nicla Vision > Proximity_Blink***
+
+## Goals
+
+The goals of this project are:
+
+- Set up the needed libraries
+- Learn how to interact with the proximity readings
+- Change the RGB values of the LED
+
+### Required Hardware and Software
+
+* [Nicla Vision board](https://store.arduino.cc/products/nicla-vision)
+* VL53L1X library (available in the Library Manager)
+
+## Instructions
+
+### Time of Flight Sensor
+
+![Arduino Nicla Vision - Time of Flight sensor](assets/nicla-vision-tof.png)
+
+To make sure that the sketch works properly, the latest versions of the **Arduino Mbed Core** and the **VL53L1X library** need to be installed. Both can be found inside the Arduino IDE: the **Arduino Mbed Core** in the **Boards Manager** and the **VL53L1X library** in the **Library Manager**.
+
+### Include the Needed Libraries and Objects Declaration
+
+First of all, declare an instance of the sensor's class so you can access it later on in your sketch. We use variables to control the time elements in the sketch; this will make sure that the readings stay accurate over time.
+
+```cpp
+#include "VL53L1X.h"
+VL53L1X proximity;
+
+bool blinkState = false;
+int reading = 0;
+int timeStart = 0;
+int blinkTime = 2000;
+```
+
+### Initialize the Proximity Sensor and the LED
+
+Inside the setup you need to initialize and configure the proximity sensor. The RGB LED also needs to be set as an output to make it light up and enable us to change its behavior.
+
+***The LEDs are accessed in the same way as on the Portenta H7: LEDR, LEDG and LEDB***
+
+```cpp
+void setup() {
+  Serial.begin(115200);
+  Wire1.begin();
+  Wire1.setClock(400000); // use 400 kHz I2C
+  proximity.setBus(&Wire1);
+
+  pinMode(LEDB, OUTPUT);
+  digitalWrite(LEDB, blinkState);
+
+  if (!proximity.init()) {
+    Serial.println("Failed to detect and initialize sensor!");
+    while (1);
+  }
+
+  proximity.setDistanceMode(VL53L1X::Long);
+  proximity.setMeasurementTimingBudget(10000);
+  proximity.startContinuous(10);
+}
+```
+
+***Make sure you initialize `Wire1`, set the clock speed to 400 kHz and set the bus pointer to `Wire1`; it won't work if you don't add these settings***
+
+### Control the Speed of the Blink
+
+The sketch gets a new reading on every loop and stores it; the state of the LED then changes once the stored interval has elapsed, after which another proximity reading takes over.
+
+```cpp
+void loop() {
+  reading = proximity.read();
+  Serial.println(reading);
+
+  if (millis() - timeStart >= reading) {
+    digitalWrite(LEDB, blinkState);
+    timeStart = millis();
+
+    blinkState = !blinkState;
+  }
+}
+```
+
+### Complete Sketch
+
+```cpp
+#include "VL53L1X.h"
+VL53L1X proximity;
+
+bool blinkState = false;
+int reading = 0;
+int timeStart = 0;
+int blinkTime = 2000;
+
+void setup() {
+  Serial.begin(115200);
+  Wire1.begin();
+  Wire1.setClock(400000); // use 400 kHz I2C
+  proximity.setBus(&Wire1);
+
+  pinMode(LEDB, OUTPUT);
+  digitalWrite(LEDB, blinkState);
+
+  if (!proximity.init()) {
+    Serial.println("Failed to detect and initialize sensor!");
+    while (1);
+  }
+
+  proximity.setDistanceMode(VL53L1X::Long);
+  proximity.setMeasurementTimingBudget(10000);
+  proximity.startContinuous(10);
+}
+
+void loop() {
+  reading = proximity.read();
+  Serial.println(reading);
+
+  if (millis() - timeStart >= reading) {
+    digitalWrite(LEDB, blinkState);
+    timeStart = millis();
+
+    blinkState = !blinkState;
+  }
+}
+```
+
+## API
+
+| Command | Details | Type |
+| :------ | :-----: | :--- |
+| `setAddress(newAddress)` | Change the sensor's I2C address | `void` |
+| `getAddress()` | Get the sensor's I2C address | `uint8_t` |
+| `init()` | Configures the sensor and the needed data, like the usual `begin()` | `bool` |
+| `setDistanceMode(mode)` | Set the distance mode (check the datasheet). Available modes: `VL53L1X::Short`, `VL53L1X::Medium`, `VL53L1X::Long`, `VL53L1X::Unknown` | `void` |
+| `getDistanceMode()` | Returns the mode that has been set. Available modes: `VL53L1X::Short`, `VL53L1X::Medium`, `VL53L1X::Long`, `VL53L1X::Unknown` | `enum DistanceMode` |
+| `setMeasurementTimingBudget(uSeconds)` | Set the time allowed for one measurement, in microseconds; the greater the value, the better the precision | `void` |
+| `getMeasurementTimingBudget()` | Get the measurement timing budget in microseconds | `uint32_t` |
+| `startContinuous(period)` | Start continuous readings; the parameter sets the period in milliseconds, after which a new reading is available | `void` |
+| `stopContinuous()` | Stop the continuous measurements | `void` |
+| `read()` | Get the last reading from the continuous mode | `uint16_t` |
+| `readSingle()` | Trigger one reading and get its result | `uint16_t` |
+| `dataReady()` | Returns whether the sensor has new data available | `bool` |
+| `setTimeout(mSeconds)` | Configure how many milliseconds the sensor will wait for a proper reading before aborting and continuing with a new one; 0 disables the timeout | `void` |
+| `getTimeout()` | Get the configured timeout value | `uint16_t` |
+| `timeoutOccurred()` | Returns true whenever the sensor has had a timeout | `bool` |
+
+## Conclusion
+
+In this tutorial we went through how to get readings from the ToF sensor and how to use these readings to change how the built-in LED behaves. At the end of the tutorial you can also find a reference list for the ToF library.
\ No newline at end of file
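
As a closing aside on the blink logic: the loop uses the raw millimeter reading directly as the blink period, so a very close object or an out-of-range spike changes the rate abruptly. A simple refinement (a hypothetical helper of our own, not part of the sketch or of the VL53L1X library) is to clamp the reading to a sensible range first. A minimal plain C++ sketch:

```cpp
#include <algorithm>
#include <cstdint>

// Clamp a VL53L1X distance reading (in millimeters) to a usable blink
// period: never faster than 50 ms, never slower than 2000 ms.
uint16_t blinkPeriodMs(uint16_t distanceMm) {
  return std::max<uint16_t>(50, std::min<uint16_t>(distanceMm, 2000));
}
```

In the loop, `millis() - timeStart >= blinkPeriodMs(reading)` would then replace the direct comparison against `reading`.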