At the same time, I was also thinking about the networking side of Blobby. During my demonstration last week, someone asked a great question about how the feeds from all six webcams could be processed simultaneously at a consistently high frame rate. When planning the project, I imagined that webcam processing would take place on a separate machine, allowing the workload to be distributed efficiently. However, I wasn't completely sure how this would work, so I spent some time this week exploring potential hardware and software setups.
First, I researched ways to connect webcams to a machine and discovered that some webcams can directly broadcast to an IP address. I started thinking that an external server, such as an IDM server, could pull the live feed from the IP, blobify it, and send the blob shapes to the host computer. The host computer could then arrange and display all the blobs.
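To get a feel for what that server-to-host hand-off might look like, here is a rough sketch of the "send the blob shapes, not the frames" idea. The host address, port, and message shape are all placeholder assumptions for illustration, not Blobby's actual protocol:

```python
# Sketch: the processing server sends only extracted blob outlines to the
# host computer as small JSON messages, instead of streaming full frames.
# Host/port values below are hypothetical placeholders.
import json
import socket

def encode_blobs(blobs: list[list[tuple[int, int]]], camera_id: int) -> bytes:
    """Pack each blob's outline points into a compact JSON payload."""
    return json.dumps({
        "camera": camera_id,
        "blobs": [[list(pt) for pt in outline] for outline in blobs],
    }).encode("utf-8")

def send_blobs(payload: bytes, host: str = "192.168.0.10", port: int = 9000) -> None:
    # UDP keeps latency low; if a frame's message is dropped, the next
    # frame's blobs simply replace it, so reliability isn't critical.
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(payload, (host, port))
```

Sending only outline coordinates keeps the messages tiny compared to video, which is what would let one host comfortably receive and arrange blobs from six cameras.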
In this setup, the webcam processing would occur outside of the website, eliminating the need to do it in JavaScript in the browser. To quickly prototype this, I created a Python script using OpenCV that takes a webcam feed from a URL and continuously runs the MediaPipe (Python) segmentation model on each frame.
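A prototype along those lines might look something like the sketch below. The stream URL is a placeholder, and the thresholding step is my own assumption about how a segmentation mask would be turned into a "blob" image:

```python
# Sketch: pull frames from an IP webcam URL with OpenCV and run MediaPipe's
# selfie-segmentation model on each one. The stream URL is a placeholder.
import numpy as np

try:
    import cv2
    import mediapipe as mp
except ImportError:  # lets the pure-numpy helper below run without them
    cv2 = mp = None

def mask_to_blob(mask: np.ndarray, threshold: float = 0.5) -> np.ndarray:
    """Threshold a float segmentation mask (0..1) into a binary blob image."""
    return (mask > threshold).astype(np.uint8) * 255

def run(stream_url: str = "http://192.168.0.42:8080/video") -> None:
    segmenter = mp.solutions.selfie_segmentation.SelfieSegmentation(model_selection=1)
    cap = cv2.VideoCapture(stream_url)
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB; OpenCV decodes frames as BGR.
        rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
        result = segmenter.process(rgb)
        blob = mask_to_blob(result.segmentation_mask)
        cv2.imshow("blob", blob)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()

if __name__ == "__main__":
    run()
```

Since `cv2.VideoCapture` accepts HTTP/RTSP URLs directly, the same loop works whether the feed comes from a local device or an IP webcam on the network.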