Some of the Big Data Europe team were in Luxembourg this week for the Information and Networking Days on Horizon 2020 Big Data Public-Private Partnership topics 2017. Admittedly, this was partly motivated by a desire to work towards continuation projects, but it was also a chance to promote the work we’ve been doing.
During the two days, it was made clear that anyone submitting proposals for these calls is going to need to include partners who own, or can access, ‘appropriately large and complex datasets’ that must be European.
And just what might ‘appropriately large and complex’ mean exactly?
EU Commission Project Officer, Stefano Bertolo, offered a definition in terms of progression. We’re now able to handle data volumes that were unmanageable only 3–5 years ago. The next round of projects should handle datasets that are very much on the edge of what can be tackled now, anticipating that those datasets will be commonplace by the time the project ends.
As well as appropriately large and complex data, your project is going to need a flexible, configurable infrastructure to process it. The infrastructure needs to be easy to maintain, and it should preferably be one that is free to use, is open source, and is backed by a community of engineers using it for real-world big data tasks. You’ll want proof that it can be used to process those appropriately large and complex datasets too.
The Big Data Integrator platform meets all of those requirements, of course – and it is there for you to use, adapt, add to, and configure as you wish.