Our web server is off-site, colocated at a typical data center. On-site we have networked devices such as a label printer and a weigh scale.

To generate a shipping label from the Firefox browser, we request a page from the server and then fetch the weight from the scale. At first this was a problem: we had a BeagleBone serving the weight over plain HTTP, and Firefox blocked the request. That was solvable by serving the weight over HTTPS with a CORS header, using a self-signed cert and whitelisting the private IP address in Firefox.
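For reference, the weight endpoint on the BeagleBone amounts to something like the minimal sketch below (the port, cert paths, allowed origin, and read_scale() stub are illustrative, not our actual setup):

#!/usr/bin/env python3
"""Minimal sketch of the weight endpoint on the BeagleBone.

Serves GET /weight over HTTPS with a CORS header so a page loaded from
our off-site HTTPS server can fetch it. Paths, port, origin, and the
read_scale() stub are placeholders.
"""
import json
import ssl
from http.server import HTTPServer, BaseHTTPRequestHandler

CERTFILE = "/etc/ssl/private/scale.crt"   # self-signed cert (assumed path)
KEYFILE = "/etc/ssl/private/scale.key"    # matching private key (assumed path)

def read_scale():
    """Placeholder: return the current reading from the scale."""
    return {"weight": 0.0, "unit": "kg"}

class WeightHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path != "/weight":
            self.send_error(404)
            return
        body = json.dumps(read_scale()).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        # Let the page served from our (hypothetical) shop domain read this.
        self.send_header("Access-Control-Allow-Origin", "https://shop.example.com")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    httpd = HTTPServer(("0.0.0.0", 8443), WeightHandler)
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    ctx.load_cert_chain(certfile=CERTFILE, keyfile=KEYFILE)
    httpd.socket = ctx.wrap_socket(httpd.socket, server_side=True)
    httpd.serve_forever()

With that in place the label page fetches the weight over HTTPS end to end, and Firefox is satisfied.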
But for printing the label we have no control over the HTTP server (and Zebra seems to have no plans to update their print servers): it does not serve HTTPS, so default Firefox blocks the XHR. Firefox does offer a way to unblock mixed content on a website temporarily, but you can't unblock *before* the request, which invalidates the label, and as soon as you navigate away from the site the block is reinstated. So the only stable solution appears to be setting the security.mixed_content.block_active_content preference to false. But that means allowing mixed active content on any page served anywhere on the net.

So I'm wondering, first, am I missing something obvious? Or, now that Debian has thrown its lot in with Firefox, is this an issue for anyone else? Is there a reason why we can't permanently allow mixed content just for websites we control, or allow XHR to private IPs? I think I saw a Firefox bug thread where these concerns were summarily dismissed, but there must be a lot of shops like ours out there. How do they handle this security issue?

Thanks for reading.

Mark