What we did
Show and Tell (15 minutes)
- New Grove Sensors (Speech Recognition and Alcohol)
- Azure Sphere DevKit
Emotion Mesh! Edge ML & IoT Project (2 hours, 45 minutes)
Recap
- Talk about what we did last week
- Soldered the Argon to an Adafruit PP board
- Updated Coral board to Mendel v3
- Moved the Serial connection to UART3
- Auto-mount the SD Card to /disk1 on startup
- Trained and tuned a local emotion detection model for the Coral
This time
- Implement the server-to-client WebSocket call with the image name and emotion result (see the sketch after this list)
- Display captured result and toggle off streaming display
- Show graphical result of emotion detection (pie chart?) on the web UI
- Prompt the user to hit the green or red button to “vote” on the result
- Capture the vote from the button box and send a reset to the web UI (see the second sketch after this list)
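
A minimal sketch of the server-to-client push, assuming the server side is Python using the `websockets` package; the port, the message fields (`image`, `emotion`, `confidence`), and the `broadcast_result` helper are illustrative placeholders, not the project's actual API:

```python
# Sketch only: push an emotion result to every connected web UI client.
import asyncio
import json

import websockets

connected = set()  # currently connected web UI clients

async def handler(websocket, path=None):
    # Track clients so results can be pushed to every open web UI.
    connected.add(websocket)
    try:
        await websocket.wait_closed()
    finally:
        connected.discard(websocket)

async def broadcast_result(image_name, emotion, score):
    # Send one emotion result to all connected clients.
    message = json.dumps({
        "type": "emotion-result",
        "image": image_name,   # e.g. "capture-0042.jpg"
        "emotion": emotion,    # e.g. "happy"
        "confidence": score,   # model confidence, 0.0-1.0
    })
    for ws in list(connected):
        try:
            await ws.send(message)
        except websockets.ConnectionClosed:
            connected.discard(ws)

async def main():
    async with websockets.serve(handler, "0.0.0.0", 8765):
        await asyncio.Future()  # serve until cancelled

if __name__ == "__main__":
    asyncio.run(main())
```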
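
And a rough sketch of the button-box capture and reset. Everything here is an assumption: that the box reports over the UART serial link via pyserial, that a press arrives as a single 'G' or 'R' byte, and that the `connected` set from the sketch above is reused.

```python
# Sketch only: read the green/red vote from the button box over serial,
# forward it to the web UI, then tell the UI to reset for the next run.
import asyncio
import json

import serial  # pyserial

SERIAL_PORT = "/dev/ttyS0"  # placeholder: whatever device UART3 shows up as
BAUD_RATE = 115200

async def watch_button_box(connected):
    # `connected` is the set of web UI sockets from the sketch above.
    port = serial.Serial(SERIAL_PORT, BAUD_RATE, timeout=0.1)
    loop = asyncio.get_running_loop()
    while True:
        # Read one byte in a worker thread so the event loop stays responsive.
        raw = await loop.run_in_executor(None, port.read, 1)
        if raw in (b"G", b"R"):
            vote = "up" if raw == b"G" else "down"
            for ws in list(connected):
                await ws.send(json.dumps({"type": "vote", "vote": vote}))
                await ws.send(json.dumps({"type": "reset"}))
```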
Stretch (probably next time)
- Figure out how I'm going to capture stats for each capture
- Start working on NeoPixel strips and an ultrasonic sensor for triggering the demo (rough sketch below)
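
Very rough sketch of the kind of trigger loop the ultrasonic stretch goal implies; `read_distance_cm()` stands in for whatever sensor driver gets used, and the threshold and cool-down values are made up:

```python
# Sketch only: fire the demo when someone walks within range.
import random
import time

TRIGGER_CM = 80    # fire when someone is closer than this
COOLDOWN_S = 10    # don't retrigger immediately

def read_distance_cm():
    # Placeholder: replace with the real ultrasonic sensor read.
    return random.uniform(20, 300)

def start_demo():
    print("Trigger! Kick off the capture + NeoPixel animation here.")

def main():
    last_fired = 0.0
    while True:
        if read_distance_cm() < TRIGGER_CM:
            now = time.monotonic()
            if now - last_fired > COOLDOWN_S:
                last_fired = now
                start_demo()
        time.sleep(0.1)

if __name__ == "__main__":
    main()
```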
What we learned
- I need the Myo band, maybe… https://smile.amazon.com/Thalmic-Labs-Gesture-Control-Presentations/dp/B00VHWBH02/ref=smiwwwrco2gosmi3905707922?encoding=UTF8&%2AVersion%2A=1&%2Aentries%2A=0&ie=UTF8
- Bot needs to take HSL colors (thanks @kleinfreund)
- Bot needs a police command (solid and flashing colors)
Thanks!
- @phrakberg resubbed for 3 mo!
- @roberttables resubbed for 3 mo!