I have been asked to design an IP camera surveillance service for a client - they are not keen on wired ethernet and hope to use Wi-Fi connected cameras. There could be more than 10 cameras around a farm, both indoors and outdoors.
I have looked at the data rates for 10x HD cameras using cctvcalculator.net and I see 42Mbps for h.264 video, with storage consumed at 5.2MB/s (that’s 13.5TB/month).
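For what it’s worth, those figures are internally consistent. A quick sanity check of the calculator’s numbers, assuming a 30-day month:

```python
# Sanity-check the cctvcalculator.net figures for 10 HD cameras.
# Assumptions: 42 Mbps aggregate H.264 bitrate, 30-day month.
aggregate_mbps = 42
bytes_per_sec = aggregate_mbps * 1e6 / 8          # bits -> bytes
mb_per_sec = bytes_per_sec / 1e6                  # -> 5.25 MB/s (~5.2)
tb_per_month = bytes_per_sec * 86400 * 30 / 1e12  # -> ~13.6 TB (~13.5)
print(f"{mb_per_sec:.2f} MB/s, {tb_per_month:.1f} TB/month")
```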
The Wi-Fi network usage rate worries me.
802.11n is the minimum I would dare use, and preferably 802.11ac wave 2 (Wi-Fi 5) or 802.11ax (Wi-Fi 6). The ac and ax standards are not something I am familiar with, especially the multi-user MIMO parts.
As channel saturation increases, the likelihood of collisions and retries increases, at which point throughput will fall off. My question is: at what level of saturation do packet retries become dominant, so that throughput falls below the needed 42Mbps?
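There is no single saturation figure in the standards, but a rough capacity-planning sketch is possible. The numbers below are my own assumptions, not spec values: real goodput on a shared Wi-Fi cell is typically only ~50–65% of the negotiated PHY rate (MAC/contention overhead), and planners often keep offered load below ~70% of that so CSMA/CA retries don’t spiral:

```python
# Rough Wi-Fi headroom estimate (assumed planning figures, not measurements):
# - phy_rate_mbps:    negotiated PHY rate of the slowest camera link
# - mac_efficiency:   fraction of PHY rate usable as goodput (~0.5-0.65 typical)
# - safe_utilisation: keep offered load under this fraction of goodput (~0.7)
def usable_mbps(phy_rate_mbps, mac_efficiency=0.6, safe_utilisation=0.7):
    return phy_rate_mbps * mac_efficiency * safe_utilisation

# Common 11n / 11ac PHY rates, one and two spatial streams:
for phy in (72, 150, 300, 867):
    print(f"PHY {phy} Mbps -> plan for ~{usable_mbps(phy):.0f} Mbps offered load")
```

Under these assumptions, a cell where cameras associate at 72 Mbps (single-stream 11n at 20 MHz) cannot safely carry 42 Mbps, while 150 Mbps and above can; the slowest outdoor camera link will set the ceiling.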
Do you have experience installing camera networks over Wi-Fi? Did it work OK?
What did you take note of, and what problems did you run into?
Is it an ok, bad or terrible idea to use Wi-Fi for cameras?
Don’t know the answers to your specific queries, but if they don’t want to run ethernet cable, how will the cameras be powered? Perhaps some kind of powerline solution might be better than wifi. Wifi just sounds like it will be terrible, if not immediately then probably shortly after.
A place I frequent uses wired cameras with point to point wireless bridges to cover the distances. Otherwise they would have had to run fiber (which would probably have been easy when they installed the power).
Usually a PoE switch injects DC 48V onto the Ethernet cable. After 100 meters of cable, the remaining voltage drops to around DC 39V, which still satisfies the requirement. However, if power is sent beyond 100 meters without proper design, the resulting power loss becomes an issue.
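The drop is just Ohm’s law over the cable’s loop resistance. A minimal sketch, assuming the 802.3af worst-case figures (20 ohm DC loop resistance over 100 m, 15.4 W sourced at 48 V); the exact voltage at the device depends on cable gauge and actual load, so don’t treat the output as a spec value:

```python
# Rough PoE voltage-drop estimate (assumed 802.3af worst-case figures):
# - v_source:  voltage injected by the PoE switch
# - power_w:   power sourced (802.3af Type 1 max is 15.4 W)
# - loop_ohms: DC loop resistance of 100 m of Cat5e (spec worst case ~20 ohm)
def voltage_at_device(v_source=48.0, power_w=15.4, loop_ohms=20.0):
    current = power_w / v_source          # approximate current draw (A)
    return v_source - current * loop_ohms # V = v_source - I*R

print(f"~{voltage_at_device():.1f} V at the powered device")
```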