
Commit a79ab4c

Application note content update
1 parent 515791d
File tree

3 files changed: +145 -74 lines changed

content/hardware/04.pro/boards/portenta-x8/tutorials/16.edge-ai-docker-container/content.md

+145 -74
@@ -481,10 +481,11 @@ When the Docker container is selected as the deployment option, the system generates
 
 ```bash
 docker run --rm -it \
--p 1337:1337 \
-public.ecr.aws/xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx \
---api-key ei_xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx \
---run-http-server 1337
+-p 1337:1337 \
+public.ecr.aws/g7a8t7v6/inference-container:54c41c953772c053251634beec512d15f41073d1 \
+--api-key ei_38f54891ee062462d3d28bd9648dd6ae766b2093796e4d384f0ae2c0e56d0a5b \
+--run-http-server 1337 \
+--model-type int8
 ```
 
 This command pulls the latest model from the Edge Impulse container registry and runs an HTTP server on port `1337`, allowing external applications to send sensor data and receive model predictions. It uses an API key to authenticate requests.
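The HTTP contract described above can be exercised directly from any client. The sketch below is illustrative: the helper names (`build_payload`, `classify_window`) and the default host are hypothetical, while the `/api/features` endpoint, the `{"features": [...]}` request body, and the `result.classification` response field match the `main.py` script used later in this application note:

```python
def build_payload(samples):
    # Wrap raw sensor readings in the JSON body the /api/features endpoint expects
    return {"features": list(samples)}

def top_class(response_json):
    # Pick the highest-scoring class from the inference response, or None if empty
    classification = response_json.get("result", {}).get("classification", {})
    if not classification:
        return None
    label = max(classification, key=classification.get)
    return label, classification[label]

def classify_window(features, host="localhost", port=1337):
    # POST one window of features to the running container (requires `requests`)
    import requests
    response = requests.post(f"http://{host}:{port}/api/features",
                             json=build_payload(features))
    response.raise_for_status()
    return top_class(response.json())
```

With the container running, `classify_window([0.0] * 400)` returns a `(label, score)` tuple for the submitted window.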
@@ -528,38 +529,67 @@ The M4 core needs to have the **`rpc-flow-sensor.ino`** uploaded to collect and
 #include <Arduino.h>
 #include <RPC.h>
 #include <SerialRPC.h>
-#include <FlowSensor.h>
+#include <FlowSensor.h> // Include the FlowSensor library
+
+// Define Serial Debug Output (For Portenta X8 Carrier Usage)
+#if defined(ARDUINO_PORTENTA_X8)
+#define SerialDebug Serial1 // Use Serial1 for debugging on Portenta X8 Carrier
+#else
+#define SerialDebug Serial
+#endif
 
 // Define Flow Sensor Type and Pin
 #define SENSOR_TYPE YFS201
-#define SENSOR_PIN PC_7 // Flow sensor signal pin
+#define SENSOR_PIN PC_7  // Flow sensor signal pin
 
+// Create Flow Sensor Object
 FlowSensor flowSensor(SENSOR_TYPE, SENSOR_PIN);
 
 void count() {
   flowSensor.count();
 }
 
-// Function to calculate and return flow rate (for RPC)
+// Function to get Flow Rate (for RPC)
 float getFlowRate() {
   flowSensor.read();
   float flowRate = flowSensor.getFlowRate_m(); // Get flow rate in L/min
 
   if (isnan(flowRate) || isinf(flowRate)) {
-    return 0.0; // Default to 0 if no valid reading
+    return 0.0;  // Default to 0 if no valid reading
   }
   return flowRate;
 }
 
 void setup() {
-  flowSensor.begin(count);
+  SerialDebug.begin(115200);
+  //while (!SerialDebug);
+
+  SerialDebug.println("Starting Flow Sensor");
+
+  flowSensor.begin(count); // Initialize the Flow Sensor
 
-  // Register the RPC function
-  RPC.bind("flow_rate", getFlowRate);
+  // RPC Binding: Function to get flow rate
+  RPC.bind("flow_rate", [] {
+    return getFlowRate();
+  });
+
+  // RPC Binding: Receive classification results
+  RPC.bind("classification", [](std::string const& input) {
+    SerialDebug.print("Classification Received: ");
+    SerialDebug.println(input.c_str());
+    return 1;
+  });
+
+  SerialDebug.println("Setup complete.");
 }
 
 void loop() {
-  // Nothing needed in loop, RPC handles function calls when requested
+  float flowRate = getFlowRate();
+  SerialDebug.print("Flow Rate: ");
+  SerialDebug.print(flowRate);
+  SerialDebug.println(" L/min");
+
+  delay(1000);
 }
 ```
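Each `RPC.bind` call in the sketch registers a handler under a method name that the Linux side can then invoke remotely. That name-to-handler dispatch can be illustrated in plain Python (a conceptual sketch only — `bind` and `call` here are stand-ins, not the Arduino `RPC` or `msgpackrpc` APIs):

```python
# Registry mirroring what RPC.bind(name, handler) sets up on the M4
handlers = {}

def bind(name, handler):
    # Register a handler under a method name
    handlers[name] = handler

def call(name, *args):
    # What an RPC client effectively triggers when it calls a method by name
    return handlers[name](*args)

# Mirror the sketch's two bindings
bind("flow_rate", lambda: 0.0)             # would return the sensor reading on hardware
bind("classification", lambda payload: 1)  # acknowledges a received result

print(call("flow_rate"))                                             # 0.0
print(call("classification", '{"label": "anomaly", "value": 0.9}'))  # 1
```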

@@ -604,135 +634,145 @@ services:
     command: ["inference", "1337"]
 
   inference:
-    image: public.ecr.aws/g7a8t7v6/inference-container:48cd3f0d76c701a6070a27a8d9487d1733c155aa
+    image: public.ecr.aws/g7a8t7v6/inference-container:54c41c953772c053251634beec512d15f41073d1
     restart: unless-stopped
     ports:
       - 1337:1337
     networks:
       sensorfusion:
         aliases:
           - ei-inference
+    environment:
+      EI_MODEL_VERSION: ${EI_MODEL_VERSION:-float32} # Default to float32 if not set
     command: [
       "--api-key", "ei_38f54891ee062462d3d28bd9648dd6ae766b2093796e4d384f0ae2c0e56d0a5b",
       "--run-http-server", "1337",
       "--force-target", "runner-linux-aarch64",
-      "--model-variant", "int8"]
+      "--model-variant", "${EI_MODEL_VERSION}"
+    ]
 ```
 
 The compose file requires the **container, arguments and port values** that were generated when deploying the machine learning model, as explained [here](#deployment-and-real-time-inference).
 
 The `main.py` script receives flow sensor data from the M4 core and sends it to the inference container.
 
 ```python
+#!/usr/bin/env python3
 import os
 import time
+import requests  # type: ignore
+import signal
+import sys
 import json
 import argparse
-from msgpackrpc import Address as RpcAddress, Client as RpcClient, error as RpcError
 
-# Retrieve M4 Proxy settings from environment variables (or use defaults)
-m4_proxy_host = os.getenv("M4_PROXY_HOST", "m4proxy")
-m4_proxy_port = int(os.getenv("M4_PROXY_PORT", "5001"))
-m4_proxy_address = RpcAddress(m4_proxy_host, m4_proxy_port)
+from msgpackrpc import Address as RpcAddress, Client as RpcClient, error as RpcError  # type: ignore
 
-# Define the single sensor we are using
-sensors = ("flow_rate",)  # Tuple with one element to keep extend() valid
+m4_proxy_host = os.getenv('M4_PROXY_HOST', 'm4proxy')
+m4_proxy_port = int(os.getenv('M4_PROXY_PORT', '5001'))
+m4_proxy_address = RpcAddress(m4_proxy_host, m4_proxy_port)
+model_type = os.getenv("EI_MODEL_VERSION", "float32")
 
 def get_sensors_data_from_m4():
+    """Get data from the M4 via RPC (MessagePack-RPC).
+
+    The Arduino sketch on the M4 must implement the methods listed in
+    `sensors`, returning values from the attached sensor.
     """
-    Get flow sensor data from the M4 via RPC (MessagePack-RPC).
-    The Arduino sketch on the M4 must implement the "flow_rate" method.
-    """
+    data = []
+    sensors = ('flow_rate',)  # Tuple with one element so iteration yields the method name
     try:
-        get_value = lambda value: RpcClient(m4_proxy_address).call(value)  # Ensure this returns a value
-        data = [get_value(sensor) for sensor in sensors]  # Ensure it is a list
-
-        print(f"Sensor Data: {data}")  # Debug output
-        return data
+        # MsgPack-RPC client to call the M4
+        # We need a new client for each call
+        get_value = lambda value: RpcClient(m4_proxy_address).call(value)
+        data = [get_value(sensor) for sensor in sensors]
 
     except RpcError.TimeoutError:
-        print("Unable to retrieve sensor data from the M4: RPC Timeout")
-        return []  # Ensure an empty list is returned instead of `None`
+        print("Unable to retrieve sensors data from the M4: RPC Timeout")
+    return data
 
 def get_sensors_and_classify(host, port):
-    """
-    Collect sensor data and send it for classification to Edge Impulse.
-    """
+    # Construct the URL of the Edge Impulse features upload API
     url = f"http://{host}:{port}/api/features"
 
     while True:
         print("Collecting 400 features from sensors... ", end="")
-
-        data = {
-            "features": [],
-            "model_type": "int8"  # Force quantized inference mode
-        }
+
+        data = { "features": [] }
         start = time.time()
-
-        for _ in range(100):  # Collect data in chunks
-            sensor_values = get_sensors_data_from_m4()
-
-            if not isinstance(sensor_values, list):  # Validate that we get a list
-                print(f"Error: Expected list but got {type(sensor_values)} with value {sensor_values}")
-                sensor_values = []  # Default to an empty list
-
-            data["features"].extend(sensor_values)  # Avoid TypeError
-
-            time.sleep(100e-6)  # Small delay to match sampling rate
-
+        for i in range(100):
+            sensors = get_sensors_data_from_m4()
+            if sensors:
+                data["features"].extend(sensors)
+            time.sleep(100e-6)
         stop = time.time()
         print(f"Done in {stop - start:.2f} seconds.")
-
+        # print(data)
+
         try:
             response = requests.post(url, json=data)
-        except ConnectionError:
+        except requests.exceptions.ConnectionError:
             print("Connection Error: retrying later")
             time.sleep(5)
-            continue
+            break
 
         # Check the response
         if response.status_code != 200:
-            print(f"Failed to submit features. Status Code: {response.status_code}")
-            continue
-
-        print("Successfully submitted features.")
-
-        # Process the JSON response to extract classification results
-        response_data = response.json()
-        classification = response_data.get("result", {}).get("classification", {})
+            print(f"Failed to submit the features. Status Code: {response.status_code}")
+            break
+
+        print("Successfully submitted features. ", end="")
 
-        print(f"Classification: {classification}")
+        # Process the JSON response to extract the classification results
+        data = response.json()
+        classification = data['result']['classification']
+        print(f'Classification: {classification}')
 
+        # Find the class with the maximum value
         if classification:
             label = max(classification, key=classification.get)
             value = classification[label]
 
             print(f"{label}: {value}")
 
-            request_data = json.dumps({"label": label, "value": value})
+            # Create a JSON string with the label and value
+            request_data = json.dumps({'label': label, 'value': value})
+
+            res = 0
 
             try:
                 client = RpcClient(m4_proxy_address)
-                result = client.call("classification", request_data)
-                print(f"Sent to {m4_proxy_host}:{m4_proxy_port}: {request_data}. Result: {result}")
+                res = client.call('classification', request_data)
             except RpcError.TimeoutError:
-                print("Unable to send classification data to M4: RPC Timeout.")
+                print("Unable to send the classification to the M4: RPC Timeout")
+
+            print(f"Sent to {m4_proxy_host} on port {m4_proxy_port}: {request_data}. Result is {res}.")
         else:
             print("No classification found.")
 
+
 if __name__ == "__main__":
-    parser = argparse.ArgumentParser(description="Get flow sensor data and send it to inference container for classification")
+    parser = argparse.ArgumentParser(description="Get 1 second of sensor data and send it to the inference container for classification")
     parser.add_argument("host", help="The hostname or IP address of the inference server")
     parser.add_argument("port", type=int, help="The port number of the inference server")
 
+    # Parse the arguments
     args = parser.parse_args()
 
-    print("Classifying Flow Sensor Data with AI... Press Ctrl+C to stop.")
+    # Signal handler to handle Ctrl+C (SIGINT) gracefully
+    def signal_handler(_sig, _frame):
+        print("Exiting...")
+        sys.exit(0)
 
-    try:
-        get_sensors_and_classify(args.host, args.port)
-    except KeyboardInterrupt:
-        print("Exiting gracefully...")
+    # Register the signal handler for a graceful exit on Ctrl+C
+    signal.signal(signal.SIGINT, signal_handler)
+
+    print("Classifying Flow Sensor Data with AI... Press Ctrl+C to stop.")
+    print(f"Running model type: {model_type}")
+    # Run the capture, upload, and process function
+    get_sensors_and_classify(args.host, args.port)
 ```
 
 The complete project files can be downloaded in [this section](#download-the-project-code).
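The `${EI_MODEL_VERSION:-float32}` entry in the compose file relies on shell-style default expansion: the quantized `int8` variant is selected only when the variable is set, otherwise the service falls back to `float32`. A quick illustration (the `docker compose` line is shown commented because it needs the full project in place):

```shell
# With the variable unset, the default applies
unset EI_MODEL_VERSION
echo "model variant: ${EI_MODEL_VERSION:-float32}"

# With the variable exported, the quantized variant is selected
export EI_MODEL_VERSION=int8
echo "model variant: ${EI_MODEL_VERSION:-float32}"

# Typical one-shot override when starting the stack:
# EI_MODEL_VERSION=int8 docker compose up -d
```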
@@ -771,6 +811,20 @@ docker compose logs -f -n 10
 
 The system will start to collect flow sensor data, process it with the Edge Impulse model inside a Docker container and detect anomalies.
 
+For more controlled execution:
+
+```bash
+docker compose up -d
+```
+
+```bash
+docker exec -it <container-id> sh
+```
+
+```bash
+python3 -u /app/python/main.py ei-inference 1337
+```
+
 ### Arduino Cloud Integration
 
 Now that the Portenta X8 is running the Docker container with the trained inference model, the next step is integrating Arduino Cloud for active monitoring and logging of anomaly detection results.
@@ -926,19 +980,22 @@ services:
     command: ["inference", "1337"]
 
   inference:
-    image: public.ecr.aws/g7a8t7v6/inference-container:48cd3f0d76c701a6070a27a8d9487d1733c155aa
+    image: public.ecr.aws/g7a8t7v6/inference-container:54c41c953772c053251634beec512d15f41073d1
     restart: unless-stopped
     ports:
       - 1337:1337
     networks:
       sensorfusion:
         aliases:
           - ei-inference
+    environment:
+      EI_MODEL_VERSION: ${EI_MODEL_VERSION:-float32} # Default to float32 if not set
     command: [
       "--api-key", "ei_38f54891ee062462d3d28bd9648dd6ae766b2093796e4d384f0ae2c0e56d0a5b",
       "--run-http-server", "1337",
       "--force-target", "runner-linux-aarch64",
-      "--model-variant", "int8"]
+      "--model-variant", "${EI_MODEL_VERSION}"
+    ]
 ```
 
 Once the files are updated, they need to be pushed to the Portenta X8 again. Navigate to the project directory and use the following commands:
@@ -963,6 +1020,20 @@ Once deployed, the system will begin to get flow sensor data, classify it using
 
 ![Flow data visualization with Arduino Cloud](assets/arduino-cloud-visualization-example.gif)
 
+For more controlled execution:
+
+```bash
+docker compose --env-file .env up -d
+```
+
+```bash
+docker exec -it <container-id> sh
+```
+
+```bash
+python3 -u /app/python/main.py ei-inference 1337
+```
+
 This integration allows real-time anomaly detection and cloud-based monitoring, combining edge inference on the Portenta X8 with Arduino Cloud analytics. Users can remotely track flow rate anomalies, set up alerts and analyze historical trends to improve predictive maintenance of the system's point of interest.
 
 ## Additional Resources
