Hi Folks!
Happy New Year! OpenMV made it through 2025! Last year was intense! We did so much:
- We started the year with a bang at CES! It was amazing to be there!
- Then we launched the GENX320 Event Camera Sensor, bringing event vision down to an affordable price for the first time!
- Next, we launched a Kickstarter, raising over $175K in 2025!
- Then, we were at the Embedded Vision Summit in May, where we gave a talk!
- After which, we launched OpenMV IDE v4.7.0 with ROMFS support and new bootloaders for all our cameras, along with AI Model support for the N6 and AE3.
- And we were at Embedded World North America!
So... what's happened since then? Well, we've been cranking, and we're finally at our next major firmware release - OpenMV Firmware v4.8.1 with OpenMV IDE v4.8.1!
New Firmware Features
Lots of new features are packed into firmware v4.8.1. Here are the major highlights:
A New CSI Module
We now have a new class-based CSI Module, which allows you to talk to multiple camera sensors at the same time! This means we can officially support our Multispectral Thermal Camera Module, which pairs an RGB camera with a FLIR® Lepton® running simultaneously!

The new CSI module is available on all OpenMV Cam models and will replace the original sensor module. When you update to firmware v4.8.1, please update your code to use the new CSI module, as the sensor module is now deprecated. No new features will be added to the sensor module.
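Here's a rough sketch of what a two-sensor script could look like with the class-based module. The constructor argument used to select the second sensor and the exact constant names are illustrative; check the v4.8.1 documentation for the final API.

```python
# Rough sketch of the class-based CSI API -- the argument used to pick the
# second sensor is illustrative; check the v4.8.1 docs for the exact names.
import csi

rgb = csi.CSI()            # first sensor, e.g. the RGB camera
rgb.reset()
rgb.pixformat(csi.RGB565)
rgb.framesize(csi.QVGA)

lepton = csi.CSI(1)        # second sensor, e.g. the FLIR Lepton (illustrative)
lepton.reset()
lepton.pixformat(csi.GRAYSCALE)
lepton.framesize(csi.QQVGA)

while True:
    color = rgb.snapshot()      # each instance grabs frames independently
    thermal = lepton.snapshot()
    print(color, thermal)
```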
But we're not stopping at dual RGB and thermal! You heard it here first: we'll be launching dual RGB and Event Vision cameras too (only for the N6).

We're going multi-sensor and multispectral for 2026! Are you ready?
GENX320 Event Camera Mode
By popular demand, we implemented event vision mode for the GENX320, which allows your OpenMV Cam to receive exact pixel-position updates, each with a microsecond timestamp. Want to build a frequency camera that can detect the vibration frequency of every pixel in the image? Now you can!
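To make the frequency-camera idea concrete, here's a sketch of the math in plain Python. It assumes each event arrives as an (x, y, polarity, timestamp_us) tuple; how you actually pull raw events off the GENX320 depends on the new driver API, so that part is left out.

```python
# Sketch of per-pixel vibration-frequency estimation from raw events.
# Assumes each event is an (x, y, polarity, timestamp_us) tuple; reading raw
# events from the GENX320 driver is firmware-specific and not shown here.

def pixel_frequencies(events):
    """Return {(x, y): frequency_hz} from the mean period between ON events."""
    last_on = {}   # (x, y) -> timestamp of the previous ON event (us)
    periods = {}   # (x, y) -> list of measured periods (us)
    for x, y, polarity, t_us in events:
        if polarity != 1:                     # only time the rising (ON) edges
            continue
        if (x, y) in last_on:
            periods.setdefault((x, y), []).append(t_us - last_on[(x, y)])
        last_on[(x, y)] = t_us
    return {p: 1e6 / (sum(v) / len(v)) for p, v in periods.items() if v}

# Quick check: a pixel firing ON every 10000 us should read as ~100 Hz.
fake = [(5, 7, 1, t) for t in range(0, 100000, 10000)]
print(pixel_frequencies(fake))   # {(5, 7): 100.0}
```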
A New USB Debug Protocol
Firmware v4.8.1, along with OpenMV IDE v4.8.1, sets things up for OpenMV firmware v5.0.0 (currently the tip of our development branch), which introduces a new USB Debug protocol for all OpenMV Cams! The new USB Debug Protocol significantly improves the performance and reliability of all OpenMV Cams with OpenMV IDE. But it offers far more than just a better data link:
- The new protocol supports the concept of channels, which can be registered in Python (see the sketch after this list). Now you can create interfaces to move data at over 15MB/s via USB on cameras like the RT1062. We had a lot of folks asking us how to stream RAW events from the GENX320 to the PC. Now we have an answer for you! Check out the new OpenMV Python repo for more information on our desktop CLI tool that controls your OpenMV Cam with the new protocol, and the forums to see the feature in action!
- Beyond custom channels, which can be created in Python to send/receive data at high speed from the PC, you can ALSO CREATE CUSTOM TRANSPORTS!
- Want OpenMV IDE to debug your OpenMV Cam over its UART? Now you can, by creating a custom transport backed by a serial port! This feature lets you command and control your OpenMV Cam over any logical interface you desire: Ethernet, WiFi, CAN, SPI, I2C, you name it!
- And... now that we've made WiFi support a standard feature of all OpenMV Cams, WiFi debug support will be making a comeback in 2026! However, this time with a rock-solid protocol under the hood!
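To make the channel idea concrete, here's a purely illustrative device-side sketch. The module name usb_channel, its Channel class, and its write() method are hypothetical stand-ins, not the shipped API; the real interface lives in the firmware v5.0.0 docs and the new OpenMV Python repo.

```python
# Purely illustrative -- usb_channel, Channel, and write() below are
# hypothetical stand-ins for the real channel-registration API in v5.0.0.
import struct
import time

import usb_channel  # hypothetical module name

ch = usb_channel.Channel("sensor_stream")   # register a named channel with the host

counter = 0
while True:
    # Pack whatever your script produces; here just a counter and a timestamp.
    payload = struct.pack("<IQ", counter, time.ticks_us())
    ch.write(payload)                       # stream it to the PC over the new protocol
    counter += 1
```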
Universal TinyUSB Support
Along with the new USB protocol, we can now switch almost all OpenMV Cams to use TinyUSB! This will allow us to standardize our USB stack across our cameras! TinyUSB offers a bunch of significant benefits over the STM32 USB stack; in particular, it plays well with the N6's NPU and our Octal SPI Flash, fixing some critical issues we were dealing with.
An ML Library
We've also worked through many of the details of how we plan to support smartphone-level AI models running on the N6 and AE3. We've now got a solid platform that supports running AI models from Google MediaPipe, YOLOv2, YOLOv5, YOLOv8, and much more:

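For a taste of what this looks like in a script, here's a minimal sketch using the ml module together with the new CSI module. The model filename is a placeholder, and decoding the outputs depends entirely on which model you deploy.

```python
# Minimal sketch of running a model with the ml module. The model filename is
# a placeholder, and decoding `outputs` depends on the model you deploy.
import csi  # the new CSI module from this release
import ml

cam = csi.CSI()
cam.reset()
cam.pixformat(csi.RGB565)
cam.framesize(csi.QVGA)

model = ml.Model("yolov8n.tflite", load_to_fb=True)   # placeholder filename

while True:
    img = cam.snapshot()
    outputs = model.predict([img])   # raw tensors; post-process per model type
    print(outputs)
```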
Roboflow Integration
We've also set things up for training custom AI models using Roboflow! Once the OpenMV N6 and AE3 hit the market, you'll be able to train custom YOLOv8 models in the cloud and run them onboard!
And Other Notable Improvements
- The frame buffer management logic across all OpenMV Cams has been improved with a new queuing system.
- An embedded code profiler has been added to the firmware and OpenMV IDE, so we can now easily measure system performance as we optimize code in the future (the firmware has to be built in profiling mode to see this).
- Automatic unit testing has been added to GitHub Actions to prevent regressions! We can now test both the Cortex-M7 and Cortex-M55 architectures by running our code in the cloud using QEMU. This helps ensure that SIMD ops are correct along with regular C code!
- And finally, we've improved the image quality of the PAG7936 and PS5520, and we've fixed numerous bugs across the platform.
Cool! For 2026, we're going to build on the momentum and work we put in for 2025! We've got a slate of new products for release this year, which we spent 2025 developing.
Kickstarter Updates
We're manufacturing the OpenMV N6 and AE3 now! Please check out our Kickstarter page for more details! The long wait is almost over! Soon, you'll be able to experience the future!

Around the Web
Concordia SAE Aero Design (Advanced Class) put the OpenMV Cam to work on autonomous payload detection and capture using AprilTag tracking.

Their system detects an AprilTag mounted on the payload and sends its image-frame offset directly to the flight controller, allowing the aircraft to autonomously center, land, and capture the target. The team integrated OpenMV with their flight stack over MAVLink, using custom Python scripts developed during testing and validation.
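For anyone curious what the detection side of a setup like this looks like, here's a rough sketch (our own illustration against the classic sensor/image API, not the team's code; the MAVLink plumbing to the flight controller is left out):

```python
# Rough sketch: detect an AprilTag and compute its offset from the frame
# center. Sending that offset over MAVLink to the flight controller is omitted.
import sensor

sensor.reset()
sensor.set_pixformat(sensor.GRAYSCALE)   # AprilTag detection runs on grayscale
sensor.set_framesize(sensor.QQVGA)       # small frames keep detection fast
sensor.skip_frames(time=2000)

while True:
    img = sensor.snapshot()
    for tag in img.find_apriltags():
        # Normalized offset of the tag from the frame center, in [-1, 1].
        dx = (tag.cx() - img.width() / 2) / (img.width() / 2)
        dy = (tag.cy() - img.height() / 2) / (img.height() / 2)
        print("tag %d offset: dx=%.2f dy=%.2f" % (tag.id(), dx, dy))
```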

While autonomous payload pickup is the highest-scoring task in the competition, the team opted not to attempt capture during flights due to flight-controller stability risks and still secured 2nd place overall. With development experience in hand, they’re planning a full autonomous payload attempt at the next competition, where OpenMV cameras continue to play a key role in prototyping, testing, and vision-based autonomy.


World Championship RCJ 2025
Next up! Team Roboticus is at it again! Now you can follow along on their journey to the RoboCup Junior World Championship!
Roboticus cataloged their week-long trip to Brazil to compete in the World Championship! We're super impressed by the work they put into telling their story!
- Day 0: Meet the team!
- Day 1: Setup Day
- Day 2: Match Day #1
- Day 3: Match Day #2
- Day 4: Match Day #3
- Day 5: Final Match Day
Wow! Just incredible!
Anyway, that's all, folks! Back to grinding and delivering the Kickstarter!