by PedroR » Thu Jun 09, 2011 5:00 pm
Hi all
As you probably know, we've hosted a LIVE streaming event for two days using this board. (Coverage of the event is here: http://robosavvy.com/forum/viewtopic.php?p=31429#31429 )
We've compiled the kernel following the steps we already posted above and installed the MJPEG streamer package.
Once you get the kernel compiled, you have access to a whole repository of software, ready to install using opkg.
To prepare for the event, we installed the following using opkg (example commands follow the list):
- SSH server
- SFTP server
- Pure-FTPd
- the MJPEG streamer
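For reference, on the board's console the installs looked roughly like this (the exact package names depend on your opkg feed, so treat these as an illustration rather than the precise names we typed):

  opkg update
  # SSH/SFTP access and plain FTP
  opkg install openssh-sshd openssh-sftp-server
  opkg install pure-ftpd
  # the streamer itself
  opkg install mjpg-streamer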
So no custom cross-compilation, toolchain setup or other hacks were needed. We were thrilled to learn this, as it makes the board really easy to use, almost like any other Linux distro.
As for the streaming event, we used a Microsoft LifeCam HD, which was immediately recognized by the board.
The kernel was compiled with debug messages going to the console, so whenever something important changes in the system you get a message, which helps at this debugging stage.
We managed to get the stream going at 424x240 at 5 fps. CPU usage was about 90% for the whole session, with the MJPEG streamer accounting for virtually all of it.
The MJPEG streamer also includes an embedded web server, so the board was serving a website where you could watch the live stream in different ways (still pictures, Java applet, JavaScript), and we were also uploading the stream to Veetle.
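For anyone wanting to reproduce the setup, the invocation was along these lines (a sketch from memory; the /dev/video node, the web root and the port are assumptions and depend on your mjpg-streamer build):

  # UVC input at 424x240 / 5 fps, embedded HTTP server on port 8080
  mjpg_streamer -i "input_uvc.so -d /dev/video0 -r 424x240 -f 5" \
                -o "output_http.so -p 8080 -w /usr/www"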
The UPs of our experience were:
- The image/stream was actually very decent (you don't get HD detail), and both website access and image fluidity were excellent (I couldn't even tell we were doing only 5 fps).
- Setting up the whole system was a breeze, with opkg giving us access to a wide range of software packages. For example, we needed to modify a few web pages and we did so over SFTP.
- Responsiveness of the system was very good. Even though the MJPEG streamer was taking 90% of the CPU, the remaining CPU time was enough to serve the web pages from the board.
- Console access to the board is very easy with the built-in COMM port and the supplied USB-to-TTL serial cable (see the example just below).
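If you want to try the console yourself, any terminal emulator on the host PC will do; a minimal example, assuming the cable shows up as /dev/ttyUSB0 and the usual 115200 8N1 settings (both are assumptions, check your board's documentation):

  # on the host PC (Ctrl-A then K quits screen)
  screen /dev/ttyUSB0 115200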
The DOWNs:
- From time to time the MJPEG streamer aborted with a segmentation fault. We were unable to track down the reason; it seemed very random.
Despite the SIGSEGVs from the MJPEG streamer, the board never crashed, nor did we see any kernel panic or loss of functionality. Whenever the streamer died, we went in over SSH and restarted it (a simple restart loop is sketched below).
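Since we never found the cause, the simplest workaround is just to restart the streamer whenever it dies; a minimal shell sketch of what we ended up doing by hand (reusing the example command line from above, same assumptions):

  #!/bin/sh
  # keep mjpg_streamer running; restart a few seconds after any crash
  while true; do
      mjpg_streamer -i "input_uvc.so -d /dev/video0 -r 424x240 -f 5" \
                    -o "output_http.so -p 8080 -w /usr/www"
      echo "mjpg_streamer exited ($?), restarting in 5s" >&2
      sleep 5
  done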
I don't know how the experience was for people outside our office, as we have limited upload bandwidth, but here (and watching through Veetle) we are very happy to say you couldn't tell this was being done with such a small, low-cost board.
Our next step is to connect the COMM port to a robot and interface with it, along with implementing vision recognition.
The COMM port is at 3.3V, so you can connect it to:
- the Robobuilder Bluetooth socket (and use the RBC protocol to control the WHOLE robot, including querying sensors, playing motions, and doing servo-by-servo control);
- the Bioloid Zigbee port. The Zigbee ports on the Bioloid are also 3.3V. With the standard firmware you can only emulate the Remocon protocol (which is the same as saying you can send commands to drive the robot around but can't read information back).
The detail we're working on right now is understanding how to share the COMM port between the console and the robot application (and also how to deal with the kernel debug messages that pop up on it).
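One piece of that puzzle we can already sketch is keeping kernel messages off the serial port so they don't get mixed into the robot traffic; a minimal sketch, assuming a standard Linux/BusyBox userland on the board (freeing the port from the login console itself is a separate, board-specific change to /etc/inittab):

  # only let critical kernel messages reach the console
  dmesg -n 1
  # equivalent via /proc (the first field is the console log level)
  echo 1 > /proc/sys/kernel/printk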
We'll post more news as we progress.
Regards
Pedro.