HPE Open NFV Interactive Exhibition Stand at Mobile World Congress 2016

In 2016, we were at the Mobile World Congress show in Barcelona, working with some genuinely exciting interactive video technology. One of the most innovative projects we have worked on to date was unveiled at the show.

We were approached by the team at Ivory, who had been tasked with designing an exhibition stand for Hewlett Packard Enterprise (HPE). The brief was to showcase their Open Network Function Virtualisation (NFV) Partner Ecosystem. The concept was designed around this ecosystem of partner companies, each demonstrating their technology utilising the HPE platform. The nine “LED ribbons” that formed the structure of the stand represented the partners, while the booth and upper level represented the HPE Cloud.

In short, the stand was about huge networks. The concept was to visualise this: data moving backwards and forwards across the LED screens that made up the structure, with interactive content tying it all together. Each of the nine “LED ribbons” that wrapped up, across and over the stand represented one of the nine NFV developers showing their products in the demo area, while the top of the stand represented the HPE platform that makes it all possible. Oh, and they wanted it to be interactive, to represent the very nature of the Open NFV Ecosystem! Our task was not only to create video content for the screens but also to develop the interactive system behind them. With our new knowledge of networking, we set to work on the R&D.

We get to play with a lot of professional equipment at Event Projection; we have a warehouse full of it. Unfortunately, we quickly came to the conclusion that we didn’t have, and more importantly couldn’t source, what we needed for this project. Thankfully, our technical manager was confident that we could build something ourselves. After much research we decided that the most reliable form of sensor for the stand was ultrasonic. These are typically found on the back of a modern car in the reversing system; they’re the things that make the beeping noise to stop you hitting the lamppost. They’re cheap, and the detailed spec information available meant that we could predict how they would behave. We set off on the sensor research, trawling through spec sheets until we had found what we were looking for. The polar pattern (sensitivity range) was critical. We had decided to use two sensors per LED ribbon; each pair would be able to triangulate the position, velocity and direction of a person passing. The outcome of this R&D stage was signed off after a successful “proof of concept” was delivered, and work could really begin.
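The triangulation a sensor pair performs can be sketched roughly like this. This is a minimal illustration, not the actual build: the sensor spacing, coordinate frame and function names are all our assumptions. Two sensors a known distance apart each report a range to the target; the two range circles intersect to give a position, and comparing successive positions gives lateral velocity and direction.

```python
import math

SENSOR_SPACING = 1.2  # assumed metres between the two sensors on a ribbon

def locate(r1, r2, d=SENSOR_SPACING):
    """Estimate (x, y) of a target from two range readings.

    The sensors sit at (0, 0) and (d, 0); r1 and r2 are their measured
    distances to the target. Returns None if the readings are inconsistent
    (the two range circles do not intersect).
    """
    x = (r1**2 - r2**2 + d**2) / (2 * d)
    y_squared = r1**2 - x**2
    if y_squared < 0:
        return None  # noisy reading, no common intersection
    return x, math.sqrt(y_squared)

def lateral_velocity(p_prev, p_now, dt):
    """Speed and direction of travel from two successive position fixes."""
    vx = (p_now[0] - p_prev[0]) / dt
    return vx, "left-to-right" if vx > 0 else "right-to-left"
```

In practice a system like this would also need to filter out noisy echoes and handle the case where only one sensor of the pair sees the target.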

The sensors would not work on their own without an interface. With a total of 18 sonar sensors, we needed to build both the hardware and software to make it all talk. The interface is effectively a tool that takes the 18 data streams from the sensors and outputs meaningful data for our computers to work with. With no off-the-shelf solutions, this was to be known as Steve’s Box of Tears… it wasn’t so easy to build! The rest of the system consisted of two data-crunching computers and two graphics-generating computers.
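The job the interface performs could be sketched like this. Again, this is an assumed simplification rather than the real hardware: we are imagining one frame of 18 raw range readings arriving in sensor-pair order, being collapsed into one event per ribbon. The threshold value and structure names are hypothetical.

```python
from dataclasses import dataclass

NUM_RIBBONS = 9
SENSORS_PER_RIBBON = 2

@dataclass
class RibbonEvent:
    ribbon: int        # 0..8, one per LED ribbon
    distance_cm: float # nearest of the pair's two readings
    triggered: bool    # did something come close enough to react to?

def frame_to_events(frame, threshold_cm=150.0):
    """Collapse one frame of 18 raw readings into 9 per-ribbon events.

    `frame` is a list of 18 range readings in cm, ordered pair by pair.
    A ribbon 'triggers' when either of its sensors sees something
    closer than the threshold.
    """
    events = []
    for ribbon in range(NUM_RIBBONS):
        a = frame[SENSORS_PER_RIBBON * ribbon]
        b = frame[SENSORS_PER_RIBBON * ribbon + 1]
        nearest = min(a, b)
        events.append(RibbonEvent(ribbon, nearest, nearest < threshold_cm))
    return events
```

The point of a layer like this is separation of concerns: the graphics-generating machines never see raw sonar noise, only clean per-ribbon events.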

With any project that deals with video, content is king, and without good content the concept falls apart. Thankfully, with such a strong concept the ideas were plentiful. We needed to illustrate the movement of data between the user and the network. We liked the idea of rain, with data falling under gravity effects down the LED ribbons; “Data Birds” flying around, mapped onto all nine pillars; and effects similar to those in The Matrix, where a person entering the stand would cause “Data Ripples” in graphics flowing up and down the pillars. Many of these ideas were, as was to be expected, thrown out by the client. We also learnt that for almost every different sequence, a different “graphics engine” had to be written. At this point in the project, it was also decided that the screens were to be made from relatively low-resolution, 12mm pixel pitch LED screen tiles. These limited the content further: any curves in the content would look jagged on the low-resolution screens. It was decided that the screens would be covered with diffusion fabric to soften the pixels too.

The technical design and build of the stand itself was crafted by Berlin-based Minga Network, and we arrived bang on where the build schedule put us. The stand was a technical marvel, made from an extruded aluminium frame that housed the LED panels to perfection. It proved almost too perfect at the show, when visitors had to be stopped from leaning on the beautifully disguised screens! Our build had been 75% completed back in London, so we just had to wheel our rack of precious kit into the tech room and fit and wire up the sonar sensors and cameras. The set-up was smooth and the show was a great success.

Interactive Video Technology - Mobile World Congress 2016

