HOP Ubiquitous tell us about their aspirations for Data Pitch and beyond.

The impacts of climate change are some of the biggest global challenges we currently face. From the hyper-local, such as air pollution in street canyons, through the local, such as business losses due to flooding, to the national, such as crop failure caused by adverse weather, our wellbeing and sustainability are greatly affected. Yet with these challenges come opportunities for businesses to create social and economic value.

Spanish startup HOP Ubiquitous build environmental monitoring solutions for smart cities. We spoke to CEO Antonio Jara about their plans for Data Pitch.

What do you hope to achieve on the Data Pitch accelerator?

We hope to validate the data monitored by our Internet of Things (IoT) devices and third-party datasets by developing an ad hoc Machine Learning (ML) and Artificial Intelligence (AI) model. This will allow us to take full advantage of our existing resources and to monetise our data as a new revenue stream.

Consequently, we expect to move from a company focused on just selling IoT devices, to one that can also provide data-driven solutions.
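To make the validation idea above a little more concrete, here is a minimal sketch, not HOP Ubiquitous's actual pipeline: it assumes co-located readings from a low-cost IoT sensor and a trusted reference source, then fits a simple calibration model to quantify and correct the discrepancy. The synthetic NO2 values and the linear-regression choice are illustrative assumptions only.

```python
# Minimal sketch: validating low-cost IoT sensor readings against a trusted
# reference dataset by fitting a simple calibration model.
# The data here is synthetic; in practice the "reference" values would come
# from a third-party provider and "sensor" from co-located IoT devices.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(42)

# Synthetic NO2 concentrations (µg/m3): reference values plus a biased,
# noisy version standing in for raw low-cost sensor output.
reference = rng.uniform(10, 80, size=500)
sensor = 1.15 * reference + 5.0 + rng.normal(0, 3.0, size=500)

# Fit a linear calibration: reference ≈ a * sensor + b
model = LinearRegression()
model.fit(sensor.reshape(-1, 1), reference)
calibrated = model.predict(sensor.reshape(-1, 1))

print(f"MAE before calibration: {mean_absolute_error(reference, sensor):.2f}")
print(f"MAE after calibration:  {mean_absolute_error(reference, calibrated):.2f}")
```

In practice the reference data would come from the Data Provider's datasets rather than being simulated, and the calibration would likely be richer than a straight line, but the before/after error comparison is the essence of the validation step.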

What shared data will you work with and how will you use it?

We’re working with the Met Office as part of the Weather and Climate Change Challenge (creating social and economic value by reducing the impact of climate change). As the Met Office is our Data Provider, we’re carrying out our Data Pitch pilot in London, working with national and regional pollen data, real-time weather and air quality data, and weather forecasts. We will also be using data provided by our own IoT devices deployed in different cities.

We plan to use it, with specific features, as training data for a new ML model focused on weather and air quality. We’re building a solution to help cities make informed decisions that promote sustainable urban development and minimise negative environmental impacts, such as by reducing emissions.
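As a rough sketch of how such training data could feed a weather-and-air-quality model: the feature names, the random-forest choice and the synthetic data below are illustrative assumptions, not the model HOP Ubiquitous is actually building.

```python
# Minimal sketch: training a model that predicts an air-quality indicator
# from weather features, using synthetic data in place of the Met Office
# and IoT datasets. Feature names are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n = 2000

# Hypothetical features: temperature (°C), wind speed (m/s),
# relative humidity (%), and a traffic proxy (vehicles/hour).
X = np.column_stack([
    rng.uniform(-5, 35, n),     # temperature
    rng.uniform(0, 15, n),      # wind speed
    rng.uniform(20, 100, n),    # humidity
    rng.uniform(100, 3000, n),  # traffic proxy
])

# Synthetic NO2 target: higher with traffic, lower with wind, plus noise.
y = np.clip(0.02 * X[:, 3] - 2.5 * X[:, 1] + 0.3 * X[:, 0]
            + rng.normal(0, 5, n), 0, None)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

print(f"R^2 on held-out data: {r2_score(y_test, model.predict(X_test)):.2f}")

# A 'what-if' style query: the same conditions but with wind speed doubled,
# to explore how predicted NO2 changes under a hypothetical scenario.
scenario = X_test.copy()
scenario[:, 1] *= 2
print(f"Mean predicted NO2: {model.predict(X_test).mean():.1f} (observed) "
      f"vs {model.predict(scenario).mean():.1f} (wind doubled)")
```

The final scenario query hints at how a trained model of this kind can support the ‘what-if’ analysis mentioned later in this interview.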

Why do you think it is important for startups to work with large scale data providers?

As is often the case for startups, it would be impossible for us to gather the huge amount of data needed to develop an accurate ML and AI model on our own.

Even though we have our own air quality monitoring network, the sensors we have deployed so far are allocated for pilots and private organisations, and how we use the data is restricted. It is therefore essential for us to work with large-scale data providers to get the quantity of accurate and reliable datasets that we need to build our model.

What’s the best thing about working with data?

The best thing is that we will be able to offer a reliable IoT solution that is grounded in evidence. It allows us to build ‘what-if’ scenarios that are much closer to reality than they otherwise would be. This ensures that end users can factor in both past and predicted events, in a context-aware environment, when making urban planning decisions, which reduces the risk of setbacks or unexpected incidents.

If you could change one thing about the data ecosystem what would it be?

We would like to see greater awareness of the importance of investing in accurate and appropriate datasets to ensure high quality across different providers. Data quality is crucial, which is why we think it’s so important to support initiatives like the IEEE Standards Association’s P2510 – Standard for Establishing Quality of Data Sensor Parameters in the Internet of Things Environment.