The final Data Science Ireland Meetup of 2018 finished off in style as once again our world-class speakers delivered some top-quality presentations at Huckletree D2.
This month’s meetup was once again in association with Alldus AI Workforce Solutions. We were delighted to have Alldus on board and would like to say a big thank you to their team for their sponsorship and for providing the delicious pizza.
First up, Volograms CEO Rafael Pagés gave us an insight into some of the great applications they are working on in Augmented Reality and Virtual Reality, discussing how AR, VR and volumetric video are changing the way we interact with digital media and the world around us. They believe that volumetric video (4D video) will shape the way we communicate and express ourselves in the future.
Volograms create volumetric holograms from videos captured from different viewpoints, which can then be experienced in Virtual and Augmented Reality applications. The process works with many different camera setups, from large professional rigs to casually captured scenes, and in many different environments, from green-screen studios to uncontrolled outdoor scenes. You can even do it yourself from your own mobile phone using their app.
Rafa showed us some of the cool applications Volograms have worked on in the past few months, including doing keepie-uppies with a professional footballer, a volopresenter speaking to a room of students and even a reinterpretation of Samuel Beckett’s Play in Virtual Reality, which won the NEM Art Award at the NEM Summit in 2017. This is just the start for Volograms and I am sure we will see plenty more use cases of their Virtual Reality applications in the very near future.
Susan Sweeney from Dublin Business School continued the virtual reality theme and introduced us to her alter ego #Suborg. Susan discussed a couple of projects she worked on with her team of nine students at DBS, including how they used a social constructivist approach to introduce a visual literacy element to a business curriculum, and a really interesting project called #Cyborgart, which examines the affordances arts students gained by learning how to incorporate simple, cheap but effective sensors into their art.
In the first project, she discussed their approach to adding visual literacy to the curriculum and how they went about it. Data science came in through developing the ability to read images and engage with abstract thinking, an ability that plays a big part in the conceptualising and pattern matching used in data and big data analyses. The use of data visualisation in business requires an understanding of visual literacy and pattern recognition, which in turn helps develop a meaningful visual landscape with interactive technology.
She also challenged her students to create a cyborg version of themselves, and that is how #Cyborgart came about. It was incredible to learn that all these images were created simply using PowerPoint, yet they were really effective. Susan stressed the importance of presenting visual literacy through storytelling: don’t be prescriptive, allow it to evolve and allow people to be creative. Exploration and design are key to telling these stories, and with the growing advancements in AR and VR, this is a great way to tell them.
We finished off the evening with Julie Connelly from the ADAPT Centre as she introduced our guests to her AI Award-nominated project AIMapIt. Working with the team at the ADAPT Centre at Trinity College Dublin led by Prof. Rozenn Dahyot, Julie discussed this project with Eir, in which deep learning components have been trained to identify telecoms infrastructure in Google Street View imagery.
In collaboration with Eir, Julie and the team set out to use state-of-the-art image processing to produce an up-to-date inventory of assets, specifically identifying telecommunications masts. Having considered drone capture and crowdsourcing, they settled on Google Street View as the best available image dataset for the task at hand. Using customised Fully Convolutional Neural Networks, they set off on their task, but there were some challenges along the way, such as ESB poles and trees accounting for 90% of all false positives.
With two custom deep learning modules performing the computer vision tasks and a stochastic optimisation module addressing GPS tagging, they found a solution for telegraph pole detection. Having covered 95,500 km of roads with roughly 20 poles per km, they found that the poles identified and geotagged had a 92% precision rate, with GPS accuracy within 2 m. This is just one of the many great projects happening at the ADAPT Centre and you should definitely check them out if you get a chance.
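To give a sense of the scale those figures imply, here is a quick back-of-the-envelope sketch (illustrative arithmetic only; the numbers come from the talk, but this is not the AIMapIt team's actual evaluation code):

```python
# Figures quoted in the talk
road_km = 95_500        # kilometres of road surveyed via Street View
poles_per_km = 20       # approximate pole density
precision = 0.92        # fraction of detections that were genuine poles

# Rough estimate of poles along the surveyed roads
estimated_poles = road_km * poles_per_km

# At 92% precision, roughly 8 of every 100 detections are false
# positives (mostly ESB poles and trees, per the talk)
false_positives_per_100 = round((1 - precision) * 100)

print(f"Estimated poles along surveyed roads: {estimated_poles:,}")
print(f"False positives per 100 detections: {false_positives_per_100}")
```

Nearly two million poles across the road network is exactly the kind of inventory problem where automated detection pays off over manual surveying.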
Once again we would like to thank the guys at Huckletree D2 for having us at their amazing venue, and also our speakers Julie, Rafa and Susan for taking the time out of their busy schedules to give us an insight into their work. We hope they gave our guests plenty of takeaways to apply to their own data science and AI careers.
Thank you to our guests who came along on Wednesday evening, and to all those who attended throughout the year, helping us grow to over 1,000 members since launching in March. Have a great Christmas and we will see you all again in the New Year.