Science Park 140, Amsterdam (Netherlands)
The EOSC-hub Digital Innovation Hub is a non-profit, multi-dimensional entity that enables research e-Infrastructures to support business organisations, stimulating the innovation potential of research infrastructures and helping SMEs, start-ups and other innovative actors tap into the academic world, both to access knowledge and to use technical services. The ultimate goal is to create a one-stop shop that brings IT services, research data, and expertise together in a single place to support innovation in industry.
The value proposition of the EOSC-hub DIH can be summarized in five key areas:
Brokerage and innovation
In this regard, the EOSC-hub DIH aims to:
- Facilitate effective collaboration within local and international networks.
- Promote “open innovation” as the means to leverage resources available in the e-Infrastructure network and to strategically manage business innovation processes.
- Generate innovation ecosystems around e-Infrastructures.
- Build a wider community that brings together researchers, web entrepreneurs, startups, SMEs, investors and corporates from different sectors and geographies.
- Expose startups/SMEs to new markets, cultures and business opportunities.
Provide the means for business incubation of innovative ideas
Activities in this space include, but are not limited to:
- Identifying and refining innovative ideas and facilitating the start-up of operational groups.
- Providing the necessary expertise and infrastructure (data, hardware, software, platforms).
- Creating conditions to showcase the benefits of new services and products within the e-infrastructure.
Access to public and private funding and facilitate market uptake
A dedicated unit aims to support both pilots and Competence Centres (CCs) in building a successful market take-up and commercialisation strategy, through tailor-made coaching, market insights and networking with investors and corporates. A business-oriented coaching team will be established with the mission to accelerate the market uptake of pilots and CCs and the exploitation of their main achievements and results.
Creating opportunities for industry to obtain resources (e.g. products, services, data, platforms and testing facilities) to establish new business
Gaining new competencies and skills within research and academic spaces for supporting spin-outs.
Fostering the re-use of open e-infrastructures ecosystems for innovation.
Improving exploitation of thematic/community-driven services among e-infrastructures.
Service-specific tutorials, business coaching webinars and formal certification in service management according to FitSM.
Link to national or regional initiatives for digitising industry
The activities of the hub are well aligned with the Dutch national initiative for digitising industry, Smart Industry - Dutch Industry fit for the Future.
Contacts have already been established with the following organisations, although our intended collaboration will not be limited to these:
- DIATOMIC (SAE project) grant agreement No. 761809
- European Data Incubator (EDI)
- H2020 project EGI-Engage
Market and Services
- Agriculture, hunting and forestry
- Other community, social and personal service activities (media, entertainment, etc.)
- Other Manufacturing
- TRL2 - Technology concept and/or application formulated
- TRL3 - Analytical and experimental critical function and/or characteristic proof of concept
- TRL4 - Component and/or breadboard validation in laboratory environment
- TRL5 - Component and/or breadboard validation in relevant environment
- TRL6 - System/subsystem model or prototype demonstration in a relevant environment
- TRL7 - System prototype demonstration in an operational environment
- TRL8 - Actual system completed and qualified through test and demonstration
- Ecosystem building, scouting, brokerage, networking
- Visioning and Strategy Development for Businesses
- Collaborative Research
- Concept validation and prototyping
- Testing and validation
- Pre-competitive series production
- Commercial infrastructure
- Incubator/accelerator support
- Access to Funding and Investor Readiness Services
- Education and skills development
Client profile: TEXA, an Italian SME founded in 1992, designs, produces and sells diagnostic instruments for cars, motorbikes, and other vehicles. A serious challenge in this industry is that vehicle manufacturers generally have limited knowledge of a vehicle’s life once it leaves them. A service that can predict failures, mechanical problems or damage at the component level, and offer detailed information on these components, would be extremely valuable, saving manufacturers and fleet managers time and money. This service would gather and analyse data from TEXA’s sensors, which could be used to redesign parts and modify maintenance schedules. This type of analysis requires significant computing power.
Client needs and provided solution: CINECA, with the support of T2I, an Italian research organisation that helps companies through the design, development and testing of new products and services, developed four Data Analytics prototype services for TEXA.
These are based on information gathered from TEXA’s On-Board Diagnostics (OBD) systems. These services cover areas that may affect the reliability, condition, or service needs of a vehicle, such as how it is driven, failure patterns, and the overall health of the vehicle. A Cloud HPC-powered workflow was developed, designed to integrate easily into TEXA’s existing automotive Data Analytics services. A service architecture has been defined that connects the existing TEXA infrastructure, equipped to collect data from installed black boxes, to an HPC Cloud provider.
Benefit: TEXA estimated the Net Present Value of these new services to reach an overall value of €1.2 M over the first 3 years of availability. The ability to use an HPC-enabled workflow to analyse data from their diagnostics systems will enable better oversight of fleet vehicles and prediction of failures in time for these to be addressed.
Training and Technology support
Client profile: Numeca International, a Belgian SME, in partnership with its Italian distributor NSI, held a training workshop at the Milan World Join Center on 24 November 2016 to demonstrate the best techniques and most advanced technologies for Computational Fluid Dynamics (CFD). The training was based on Numeca’s AutoMesh™ grid generation suite, which leverages UberCloud containers (UberCloud is an EGI business partner) to offer fast, high-fidelity meshes to hundreds of customers.
Client needs and provided solution: EGI cloud providers CESNET (Czech Republic) and FZ Jülich (Germany) provided the underlying compute facilities, allowing the ~10 training participants to gain hands-on experience with the software and applications: AutoGrid5™, the reference in the turbomachinery market; HEXPRESS™, Numeca’s full hexa unstructured mesh generator; and HEXPRESS/Hybrid™, which produces ultra-fast unstructured hex-dominant conformal meshes of complex geometries starting from unclean CAD data and runs on multiple processors. The goal was to gain experience with specific solvers that offer a versatile solution adapted to all kinds of industrial applications.
This training was considered a first trial for Numeca in using the UberCloud containers on EGI’s cloud platform as a basis for a future partnership, and was a direct result of the joint webinar held by EGI and UberCloud on 20 October 2016, “How SMEs Can Use EGI’s Cloud for Computer-Aided Engineering (CAE)“.
As both participants and trainers reported a positive experience with the performance of the containers, discussions are underway on how EGI can continue to support R&D projects and future commercial trainings.
Cloud compute technology support
Client profile 1: Ecohydros, S.L., an SME specialized in the research, monitoring and management of aquatic ecosystems and related resources. Its business lines include the development and commercialization of automatic and remote monitoring devices and infrastructures.
Since its foundation in 2003, Ecohydros has carried out more than 100 projects in the field of environmental consultancy and monitoring in Spain and abroad, and has been awarded more than 10 R&D grants at regional, national and European levels, most of them related to the development and application of cyberinfrastructures (CIS) applied to water issues.
Client 1 needs:
The proliferation of toxic microalgae, which cause red tides at sea and cyanobacterial HABs (Harmful Algal Blooms) in inland waters, poses serious and costly environmental and socioeconomic impacts.
These processes continuously affect services such as the provision of drinking water (including that from desalination plants), marine aquaculture, and recreational water activities. There is no estimate of the economic impact at a global level, but figures from some studies (e.g. those summarized in Economic Consequences of Harmful Algal Blooms: Literature Summary (2016) [R1]) show its great magnitude.
In Galicia (a Spanish region) alone, the mussel-farming (miticulture) business generates 20,000 direct jobs and many more indirect ones, and more than €1 million per year is invested in official monitoring schemes. In China, more than 90 red tide episodes are documented per year, while in the USA the health costs associated with these episodes are estimated at $900 million per year.
While many expenses may be difficult to quantify, there is little doubt that the economic impact of specific HAB events can be serious at local and regional levels.
These processes, ultimately caused by nutrient enrichment of the water body (eutrophication), have an ecosystemic character: they are conditioned and influenced by many environmental factors (meteorological, hydrodynamic and biogeochemical), some of which are exogenous and diffuse, sometimes originating very far from the affected water mass, while others are endogenous and involve all links of the trophic network. They are therefore considered environmental alterations of high complexity and impact, occurring on a global scale and with increasing frequency, partly due to the effects of climate change.
For these reasons, their management is a challenge involving various administrations and users of water and water resources, and it is currently far from efficient, as evidenced by large investments in wastewater treatment systems that have not achieved a proportional reduction in these episodes. Likewise, early warning systems linked to preventive and protective health actions have not yet been developed in an affordable and efficient way.
New technologies can provide a qualitative leap in this area and open a business space of great volume and future projection. The conceptual basis of this assertion is that, in this type of complex, multifactorial and dynamic problem, it is critical to obtain a large volume of data on different variables and key processes; moreover, these data must be acquired at high frequency and in quasi-real time.
The development of new monitoring systems is a fact. As regards climate and physical-chemical parameters, remote data acquisition is already a possibility through data logger systems coupled with multi-probe devices and meteorological stations. Overall, the contribution of molecular techniques and physical parameters in order to understand and predict the bloom dynamics has high relevance, since it allows predicting potential harmful situations at a very early stage.
In addition, an important and growing part of this information is hyperspectral in nature, or comes from multispectral remote images or from holographic or fluorescence submersible microscopes, which is exponentially increasing both the volume of data generated per unit of time and the spatial extent of the measurements.
But the limitations lie not only in the generation of big data with advanced devices, but also in the treatment, processing, analysis and management of those data, which require simulation and visualization tools handling hundreds of variables and parameters. Data flows intensify and their relationships grow in complexity. It is also essential to calibrate the short- and medium-term predictive models in an at least semi-automatic way, sweeping numerous parameters in multiple combinations, which forces the system into highly demanding iterations. All this pushes the demand for computing beyond what a standard company or computing centre can provide.
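As an illustration, the semi-automatic model calibration described above (sweeping numerous parameters in multiple combinations and scoring each run against observations) can be sketched as a simple grid search. The bloom model, parameter names and error metric below are hypothetical toy stand-ins, not Ecohydros' actual 4D model:

```python
from itertools import product

def simulate_bloom(growth_rate, nutrient_load, days=30):
    """Toy stand-in for a HAB model run: returns daily biomass."""
    biomass = [1.0]
    for _ in range(days - 1):
        b = biomass[-1]
        # Logistic-like growth limited by available nutrients.
        biomass.append(b + growth_rate * b * (nutrient_load - b))
    return biomass

def rmse(pred, obs):
    """Root-mean-square error between predicted and observed series."""
    return (sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs)) ** 0.5

def calibrate(observations, growth_rates, nutrient_loads):
    """Sweep all parameter combinations and keep the best-scoring one.
    Each run is independent, so on a cloud backend the sweep can be
    dispatched across many compute nodes in parallel."""
    best = None
    for g, n in product(growth_rates, nutrient_loads):
        score = rmse(simulate_bloom(g, n, days=len(observations)), observations)
        if best is None or score < best[0]:
            best = (score, g, n)
    return best  # (error, growth_rate, nutrient_load)
```

Because every parameter combination is an independent model run, the sweep parallelises trivially across cloud compute nodes, which is precisely where the demand for external computing capacity arises.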
To address this challenge as a whole, the cyberinfrastructure concept can therefore represent the optimal approach and, at the same time, generate a business model that provides managers and users with key information at the right time, without requiring them to understand the intermediate and internal processes; involving multiple actors and sectors would delay, increase the cost of, and in many cases rule out access to this type of technological solution. However, a limit in computing capacity is reached when managing applications of this type. The cyberinfrastructure needed is not just cloud resources, but a set of integrated services allowing, for example, data ingestion from the instruments, data management (curation, etc.), a set of models and algorithms to add value, and a user interface for configuration and management.
Ecohydros has been working on this initiative for years and has proven the technical viability of this approach for an SME, having achieved a deployment longer than 5 years of a CIS in a reservoir.
However, Ecohydros’ internal computational resources, as well as its expertise in service architecture design and deployment, are limited, and do not allow it to progress alone on the incorporation of repositories and processes related to satellite imagery, 4D model calibration and execution, and the generation of graphical outputs.
Provided solution that meets the needs:
The main objective is to demonstrate the technical and economic advantages of applying Data Cloud Services (DCS) to the management of harmful algal blooms, supporting the key processes required (data processing, modelling, integration of images).
For this purpose, the following objectives will be achieved within the EOSC-DIH:
- Integration of all data processes into Data Cloud Services, including satellite images from MERIS and the forthcoming Sentinel-3 Ocean and Land Colour Instrument (OLCI), as soon as available.
- Integration of iterative optimization tools for 4D modelling of HABs.
- Demonstration of a HAB early detection and prediction event using the improved CIS with DCS in a cyanobacterial freshwater bloom (Water Agency as customer) and a red tide impacting on mussel production (producer association as customer).
EOSC-DIH services used: DataHub, B2SHARE, Cloud Computing (on HTC and also HPC with limited parallelism, around 100 cores on average with 16 GB/core), human services, commercialization support.
Client profile 2:
MOXOFF, an SME focused on advanced mathematical modelling and statistics applied to solve engineering problems in several industrial contexts.
Moxoff SpA is a spin-off company of the MOX laboratory (Modelling and Scientific Computing) of the Mathematics department of Politecnico di Milano. Moxoff was founded in 2010 and since then has continuously increased its number of employees, budget and business. Its customer portfolio comprises more than 60 customers, ranging from well-known multinationals to brand-new startups.
YOTTACLE, specialized in the development of applications spanning from native mobile technology to full-stack web design, including experience in complex real-time architectures;
YOTTACLE SRL is an SME founded in 2014 by four partners with 25% share each. The partners have long-standing experience in SW development and management of complex projects. YOTTACLE is focused on the development of WEB platforms and mobile applications. Its expertise covers all the SW product lifecycle, from requirement management to design and development to testing and deployment. YOTTACLE has full-stack development expertise on complex WEB applications and in-depth knowledge of Android and iOS mobile operating systems.
MATHandSPORT, an SME founded in 2016 focused on the commercialization of smart video analysis tools in the sports field.
Clients 2 needs:
Recording videos is very common in both leisure and industrial contexts, thanks to its effectiveness as a medium and the ease of buying devices on the market. Processing videos, however, can be time-consuming, especially when a huge amount of data is gathered and manual intervention is needed to extract information. Although software to support manual processing is available on the market, technicians and professionals dealing with video analysis have, in recent years, increasingly perceived the need for automatic KPI extraction and smart workflows.
A notable example is sport: several videos are taken during training sessions and manually inspected to analyse technical gestures and enhance athletes’ performance. Sport is intrinsically competitive, and thus requires continuous improvement of the tools that speed up processing procedures.
Sport is not the only case: in medical rehabilitation centres, video analysis is a common approach to monitoring the progress of pathology in patients with reduced mobility; doctors exploit it as a non-invasive technique to extract KPIs supporting patient care. Similar scenarios are found in the security, crash test and automotive sectors, all of which share the need for a smart, automated video processing tool.
Provided solution that meets the needs:
Development of a mobile-friendly cloud platform for data-driven video analysis, to be configured as a SaaS. The goal is to answer the market need for a powerful, smart video processing tool that extracts KPIs in a data-driven and automated way: instead of manually inspecting the huge number of videos a complete analysis may require, big data are processed automatically by advanced algorithms and methods (such as functional data analysis) to extract KPIs into standard reports, minimizing user intervention and maximizing the efficacy of the analyses.
The expected result is a pilot software providing trainers, athletes and, in general, professionals who operate in the field with a tool to perform analyses through objective, standard and automated procedures on smart devices.
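As a rough illustration, the kind of objective, automated KPI extraction described above can be sketched for a single signal tracked from video (for instance one joint angle per frame). The metrics below are illustrative placeholders; the pilot itself relies on richer methods such as functional data analysis:

```python
def extract_kpis(trajectory, fps=30.0):
    """Compute simple, objective KPIs from one per-frame signal
    extracted from video (hypothetical metrics for illustration)."""
    n = len(trajectory)
    # Frame-to-frame rate of change, converted to units per second.
    velocity = [(trajectory[i + 1] - trajectory[i]) * fps for i in range(n - 1)]
    return {
        "duration_s": n / fps,                              # clip length
        "range": max(trajectory) - min(trajectory),         # movement amplitude
        "peak_velocity": max(abs(v) for v in velocity),     # fastest change
        "mean": sum(trajectory) / n,                        # average value
    }
```

A report generator could run this over every tracked signal in every video of a session, producing the standard, comparable KPIs that currently require manual inspection.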
Services from EOSC-DIH:
- EUDAT B2DROP and B2SAFE for data sharing and management. Specific management rules, e.g. periodic quality checking or automatic image features extraction, might be developed on purpose;
- EGI FedCloud resources for data processing;
- EGI OneData for data access and distribution;
- EUDAT B2ACCESS for users’ identification and authentication.
- Human services, Commercialization support.
Client profile 3:
Bentley Systems International; type – LE; business sector – software development and distribution for infrastructures. Bentley Systems acquired Action Modulers, which was initially involved in the proposal; however, there will be no impact or changes to the originally proposed or agreed work plan and expected results.
Action Modulers – Consulting and Technology, Lda.; type - SME; business sector – consulting activities in Environmental Modelling & Risk Assessment and Protection & Safety.
ACTION Modulers is a Portuguese consulting company, mainly focused on numerical modelling, development of technological solutions, safety planning services, and training activities. Currently ACTION Modulers has 9 fixed employees and around 3 external consultants. The company has two main business areas: (i) Research and Development and (ii) Safety Planning.
Client 3 needs:
Seaports are vital gateways - 74% of goods entering or leaving Europe go by sea.
In their activities, they have to face important operational, tactical or strategic decisions. They manage piloting and navigation support, as well as the closing of port operations based on adverse marine weather. They should also be able to assess environmental aspects that result from regular port activity (e.g. air emissions from vessels), and to optimize throughput and service times, increasing port competitiveness. Finally, they must respond to marine pollution incidents or search and rescue (SAR) operations.
A holistic approach to these items is possible using Internet of Things, big data fusion, numerical models and data analytics. The integration of cloud computing in this equation is critical to increase skill in numerical forecasts (high resolution is highly resource demanding), performance, reliability and scalability, in a cost-effective fashion.
ACTION Seaport is an advanced mobile-friendly platform aiming to be accurate, computationally efficient, scalable and reliable, to be capable of serving simultaneously multiple Port Authorities - as well as coastguards and other maritime authorities – worldwide in decision support to improve safety, environmental and operational performance.
The information provided in the frontend combines numerical forecasts, multi-source sensed data (e.g. buoys, AIS, satellite images), and data analytics. ACTION Seaport has already emerged as a result of multiple R&D activities, but its implementation, integration and optimization over an adequate cloud-based backend architecture is now necessary to take the application to the next level: offering the solution in a scalable, highly available environment, as required to bring the application to a commercial level.
Provided solution that meets the needs:
ACTION Seaport pilot case will efficiently present reliable and accurate information in visually striking mobile-friendly maps, tailor-made SMS/email alerts, reports, and web services (OpenGIS® Web Map Service + REST API), duly supported by a cloud infrastructure to ensure fault-tolerance, scalability, performance and improved skill and resolution of the numerical modelling forecasts, using parallel-computing techniques.
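The OpenGIS® Web Map Service mentioned above follows the standard OGC WMS request pattern. A minimal sketch of building a WMS 1.3.0 GetMap request is shown below; the endpoint and layer name are placeholders, not the real ACTION Seaport service:

```python
from urllib.parse import urlencode

def wms_getmap_url(base_url, layer, bbox, width=800, height=600):
    """Build an OGC WMS 1.3.0 GetMap request URL.
    bbox is (min_lat, min_lon, max_lat, max_lon); WMS 1.3.0 with
    EPSG:4326 uses latitude-first axis order."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": "EPSG:4326",
        "BBOX": ",".join(str(c) for c in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    return f"{base_url}?{urlencode(params)}"

# Hypothetical request for a wave-height layer over the Lisbon area.
url = wms_getmap_url("https://example.org/wms", "wave_height",
                     (38.6, -9.5, 38.8, -9.0))
```

Any standard GIS client or web frontend can consume such URLs directly, which is what makes the WMS interface a natural fit for serving forecast maps to multiple port authorities.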
ACTION Seaport pilot case will be initially applied in the Port of Lisbon - and later in other Ports worldwide, providing:
- Improved maritime situational awareness: data fusion from high resolution metocean models, drift forecasts, AIS data, webcams, satellite images, weather stations or buoys.
- Early-warning from adverse metocean conditions and daily reports from data analytics.
- Piloting and navigation support in maps and critical points.
- Smart environmental monitoring, integrating vessel data and estimated water and air parameters.
- Tactical support to marine pollution and SAR: on-demand drift model for oil, chemical, inert spills and floating objects.
- Automated assessment of port performance throughput and time services.
EOSC-DIH services used:
Cloud Compute, Archive and Online storage, Data Transfer, Human services, Commercialization support.
The IBM research team in Zurich set up a project to develop a methodology for estimating the performance, power consumption and cost of exascale systems. The project is called Algorithms and Machines (A&M) and is part of DOME, a joint program with the Netherlands Institute for Radio Astronomy (ASTRON).
The main objective of this collaboration is to develop technologies to support the Square Kilometre Array (SKA), the world’s largest radio telescope currently being developed.
The A&M team set out to model an exascale computing system required by the SKA data processing pipeline. This system and the software running on it may allow an early and fast design-space exploration. To construct the software model, the A&M methodology used a platform-independent software analysis tool that measures software properties (such as available scalar and vector instruction mix, parallelism, memory access patterns and communication behaviour). As the software models are extracted at application run-time, they can only be collected on current systems which are orders of magnitude smaller than exascale. To predict the software models at exascale, the methodology used an extrapolation tool which employs advanced statistical techniques. Once extrapolated, the software model was then combined with a hardware model that captures the performance constraints and dependencies of a computer system. The mathematical formulas allow for a fast exploration of a large design-space of hardware processor- and network-related parameters.
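As an illustration of the extrapolation step, one common statistical approach is to fit a scaling law to measurements taken at small problem sizes and then evaluate it at much larger sizes. The power-law form below is a simplified stand-in for the advanced techniques the A&M tool actually employs:

```python
import math

def fit_power_law(sizes, times):
    """Fit t = a * n^b by ordinary least squares in log-log space."""
    xs = [math.log(n) for n in sizes]
    ys = [math.log(t) for t in times]
    k = len(xs)
    mx, my = sum(xs) / k, sum(ys) / k
    # Slope of the log-log regression line gives the exponent b.
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = math.exp(my - b * mx)
    return a, b

def extrapolate(a, b, size):
    """Predict runtime at a problem size beyond the measured range."""
    return a * size ** b
```

In practice the measured runs are orders of magnitude smaller than exascale, so the quality of the extrapolation model, not the raw measurements, determines the accuracy of the final performance estimate.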
To validate the analytical performance estimates, the A&M team required access to systems with different network topologies (e.g., fat-tree and dragonfly). The team contacted EGI for support in obtaining access to such systems. EGI identified the Poznan Supercomputing and Networking Center (PSNC) in Poland as a provider able to offer such an environment and kick-started the collaboration.
PSNC offered access to Orzel / Eagle, a supercomputer with a peak performance of 1.4 PFlops and a fat-tree network interconnect fabric. The A&M team then ran MPI applications with different problem sizes and numbers of MPI processes on the system, using configurations of two- and three-level fat-tree topologies. The first validation results for the MPI-simple implementation of Graph 500 (an MPI benchmark for analytics workloads) showed that the analytical methodology can estimate time performance with an accuracy of 82%, which is a very encouraging result. In the future, more MPI applications will be analysed to validate the A&M methodology.
Project (formalized end time)
Number of employees
- Horizon 2020
Number of customers annually
Type of customers
- Start-up companies
- SMEs (<250 employees)
- Research organisations
- Artificial Intelligence and cognitive systems
- Data mining, big data, database management
- Simulation and modelling
- Software as a service and service architectures
- Cloud computing
- ICT management, logistics and business systems