Maker Pro

Low bandwidth camera

Lightning
Joined: Oct 12, 2013 · Messages: 45

Hey all,

I really want to build an underwater remotely operated vehicle for my final year project at university. The only problem I am facing is the camera feed for first-person view, more specifically the bandwidth.

Because EM waves attenuate heavily in water, I want to use an ultrasonic modem to communicate with my ground station. The catch is that this method has a limited carrier frequency (e.g. 100 kHz), so the available bandwidth is quite low.

I had the idea of using a towed buoy on the surface to relay the signal, but that seems a little clumsy. Basically, I want to know: what is the lowest-bandwidth camera signal available?

I was thinking there might be a low-resolution greyscale video protocol out there, but I haven't found anything yet because the RC community is obsessed with high-resolution 2.4 GHz 'FPV'.

Any help with this would be appreciated, thank you.
 

Gryd3
Joined: Jun 25, 2014 · Messages: 4,098

What hardware do you currently have, or what are you looking into?

There is another member on here who is using a RasPi with a camera and processing the data before transmitting it over an IPv4 network.
If you can process the image yourself, you can downsample the video, reduce the color depth, compress it, or send fewer frames per second. You have all sorts of options there... Do you know what the bandwidth limit currently is?
Do you have a desired frame rate, resolution or image quality?
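
As an illustration of those options, here is a minimal pre-processing sketch, assuming a Raspberry Pi class board with Python and OpenCV rather than the Arduino; the frame size, frame rate and the send_over_modem() stub are placeholders rather than a tested design:

```python
# Minimal pre-processing sketch (assumes Python 3 + OpenCV on a RasPi-class board).
# The frame size, target frame rate and modem hook are illustrative placeholders.
import time
import cv2

TARGET_FPS = 4            # send far fewer frames than the camera produces
FRAME_SIZE = (64, 48)     # downsampled greyscale frame: 64 x 48 = 3072 pixels

def send_over_modem(payload: bytes) -> None:
    """Placeholder for whatever the acoustic modem link ends up being."""
    pass

cap = cv2.VideoCapture(0)                     # first USB webcam
last_sent = 0.0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    now = time.time()
    if now - last_sent < 1.0 / TARGET_FPS:
        continue                              # drop frames to respect the link budget
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)             # strip the colour data
    small = cv2.resize(gray, FRAME_SIZE, interpolation=cv2.INTER_AREA)
    send_over_modem(small.tobytes())          # 3072 bytes per frame at 8 bits/pixel
    last_sent = now
cap.release()
```

Even downsampled like this, the raw stream is roughly 64 x 48 x 8 bits x 4 fps ≈ 98 kbit/s, so further reduction or compression would still be needed on a very slow link.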
 

Lightning
Joined: Oct 12, 2013 · Messages: 45

Thanks for your reply.

I currently have an Arduino Mega and an NI myRIO, and I am thinking of using an old USB webcam.
I could create a LabVIEW GUI, although I would prefer to use a dedicated monitor.

I have thought about pre-processing, although I wouldn't know where to begin with that.

My estimate of the minimum frame rate would be around 30 Hz to keep the video fluid.

Resolution and image quality are not that important, hence my earlier statement about using greyscale (black and white) to cut the RGB information out of the data stream.

Going on my earlier statement of a 100 kHz carrier frequency: using Nyquist, the maximum data rate would be about 50 kbit/s, although I would like to keep it below that if at all possible.


I am open to suggestions. Perhaps you could give me some starting information on how I would go about reducing the video stream's data rate using an Arduino, which is my preferred platform on board the submarine, as the National Instruments kit is quite expensive. :)

Thank you very much.
 

Gryd3
Joined: Jun 25, 2014 · Messages: 4,098

Well... perhaps some problems I can't help with.
I have no FPGA experience, but from what I understand they are wonderful at doing simple calculations on a metric truckload of data. The Arduino, which can handle more complex calculations on a smaller amount of data, does not look promising for interfacing with a USB camera...
It may be worth looking into an SoC that can buffer the video for retransmission, like a Raspberry Pi, Banana Pi, or BeagleBone.

I may have also done some math incorrectly... but a 50 kbit/s connection sending a minimum of 24 fps only allows roughly 2 kbit per frame. Without compression, at 8 bits per pixel, that is only about 260 pixels per frame, which gives you roughly a 16x16 video feed.
You may need to run additional ultrasonic modems in parallel to increase your throughput, or use your buoy idea to send the data from the surface.
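
To make that budget easy to experiment with, here is the same arithmetic as a small Python sketch; the 50 kbit/s figure comes from the posts above, and the frame rates and bit depths are just examples:

```python
# Uncompressed frame-size budget for a fixed link rate (figures from the thread).
def max_square_frame(link_bps: float, fps: float, bits_per_pixel: int) -> int:
    """Largest square, uncompressed frame that fits the per-frame bit budget."""
    bits_per_frame = link_bps / fps
    pixels = bits_per_frame / bits_per_pixel
    return int(pixels ** 0.5)

LINK = 50_000  # bit/s, the estimate quoted for the 100 kHz acoustic link

print(max_square_frame(LINK, 24, 8))   # 16 -> the 16x16, 8-bit feed mentioned above
print(max_square_frame(LINK, 30, 8))   # 14 -> at the desired 30 fps it gets worse
print(max_square_frame(LINK, 10, 8))   # 25 -> dropping to 10 fps helps a little
print(max_square_frame(LINK, 10, 1))   # 70 -> 1-bit black/white at 10 fps
```

Dropping colour, bit depth and frame rate together buys a fair amount even before any compression, which is in line with the options listed earlier in the thread.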
 

Lightning
Joined: Oct 12, 2013 · Messages: 45

Hmm, definitely something to ponder.

I hadn't done the math; I was not expecting the resolution to be so low.
That is probably why a lot of products use the buoy approach and others just go fully autonomous.

I have been looking for an excuse to use an FPGA to practice my VHDL/Verilog, so perhaps I can have a play and see what I can come up with.

I will have to do some more research into compression and transmission, but thanks for your help. :)
 