Human Generated Data Analytics: Natural language is a fundamental tool for humans: it lets us exchange information and even develop new ways of thinking. However, whether spoken or written, language can be vague or ambiguous, and automated tools still have great difficulty reading and comprehending the information it conveys. At the same time, the volume of information generated and disseminated through media and new communication networks makes manual processing unmanageable, so automatic tools that can “understand” what is said are indispensable for new markets. We develop advanced tools that perform these vital tasks (sorting information, recommending, responding, etc.). To build them, we apply technologies able to handle and process large volumes of information (Big Data), both in real time and in batch, together with AI and machine-learning techniques that learn and adapt to complex and diverse information. Gradiant has extensive experience in processing textual information from social media, a key competitive element in the commercial sphere: sentiment analysis, customer segmentation, recommenders, online reputation, etc.
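To illustrate one of the text-analytics tasks mentioned above, here is a minimal lexicon-based sentiment scorer. It is only a sketch: the word lists, function name and scoring rule are invented for this example, and production systems rely on trained models rather than fixed lexicons.

```python
# Tiny illustrative sentiment lexicons (assumed for this example only).
POSITIVE = {"great", "good", "excellent", "love", "happy"}
NEGATIVE = {"bad", "poor", "terrible", "hate", "sad"}

def sentiment_score(text: str) -> int:
    """Return (#positive matches - #negative matches) for the text.

    Positive result suggests positive sentiment, negative the opposite.
    """
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

print(sentiment_score("I love this great product"))  # -> 2
print(sentiment_score("bad and terrible"))           # -> -2
```

A real pipeline would add tokenization, negation handling and a learned classifier, but the sketch shows the core idea of mapping free text to a quantitative signal.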

Machine Generated Data Analytics: The amount of information currently stored about processes (logistics, industrial, etc.), systems, services (sales, connections between users, power consumption, etc.) and data traffic (router logs, etc.) is not only too large to handle manually, but also grows steadily as technologies like the Internet of Things (IoT) become widespread. Analysing these huge amounts of data can yield valuable insight into those processes: it can prevent problems by detecting abnormal results or measurements (without previously defining what “abnormal” means), or determine which events are related in more complex processes, facilitating management through prediction once we know that one event will trigger another with a certain probability. All this information also enables simulations that predict, for instance, what resources a particular process will need, so the manager can optimize resources automatically while proactively anticipating future events. We develop tools and technologies able to process large volumes of information in real time, as well as algorithms that learn independently from the information they receive, regardless of its sources, and from the reactions of users and operators (through machine learning and artificial intelligence techniques). We also look for new ways to display relevant information clearly, effectively and attractively, so users can quickly distinguish what really matters in an ocean of information.
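The idea of flagging abnormal measurements without predefining them can be sketched with a simple statistical filter: mark values whose z-score (distance from the mean in standard deviations) exceeds a threshold. This is a minimal pure-Python illustration; the data, function name and threshold are assumptions, and real deployments use richer models and streaming pipelines.

```python
from statistics import mean, stdev

def find_anomalies(values, threshold=2.0):
    """Return the values lying more than `threshold` standard
    deviations from the sample mean.

    No notion of "abnormal" is defined in advance: the baseline is
    estimated from the data itself.
    """
    mu, sigma = mean(values), stdev(values)
    return [v for v in values if abs(v - mu) / sigma > threshold]

# Example: a sensor reading that jumps to 50 stands out from the rest.
readings = [10, 11, 10, 12, 11, 10, 11, 50]
print(find_anomalies(readings))  # -> [50]
```

In practice the mean and deviation would be maintained over a sliding window so the detector adapts as the process drifts, which matches the self-learning behaviour described above.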


Research Lines:

  • eLearning
  • Human-Generated Data Analytics
  • Machine-Generated Data Analytics
  • Video Analytics
  • Health Data Analysis