Incremental learning for large scale data stream analytics in a complex environment
In the era of the Internet of Things (IoT), data are generated by devices and sensors in the form of text, images, videos, and more. Sensor data arrive continuously, as data streams, from multiple sources and different environments [1], so vast volumes of data accumulate in the cloud over time. Moreover, data streams from real-world applications are characterized by non-stationary environments [2]. Driven by high business demands, these enormous data streams must be learned from immediately as they arrive to support decision making [3]. While large-scale data streams hold great potential for improving decision making, learning from them is challenging. According to [4], big data is generally characterized by the 5Vs: Variety, Velocity, Volume, Value, and Veracity. In the machine learning literature, two of these characteristics raise the main issues in learning from data streams and trigger further investigation: huge volume and high velocity. A third issue is the non-stationary nature of data streams [5]. Volume and velocity give rise to big data through the high arrival rate of the streams, whereas non-stationarity refers to changes in the data distribution over time. These challenges provide excellent opportunities for many research directions.
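The core idea described above, updating a model incrementally as new data arrive without revisiting earlier data, even as the distribution changes, can be illustrated with a short sketch. This is a minimal example assuming scikit-learn's SGDClassifier and a synthetic two-dimensional stream whose decision boundary shifts halfway through to mimic a distribution change; the stream generator and the drift point are illustrative assumptions, not taken from the cited works.

```python
# Minimal sketch: incremental learning over a non-stationary data stream.
# Assumes scikit-learn; the stream and its drift point are simulated.
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(42)
model = SGDClassifier(random_state=0)
classes = np.array([0, 1])

def stream_batches(n_batches=100, batch_size=50):
    """Simulate a non-stationary stream: the class boundary
    shifts halfway through, emulating a distribution change."""
    for t in range(n_batches):
        X = rng.normal(size=(batch_size, 2))
        shift = 0.0 if t < n_batches // 2 else 1.5  # drift point
        y = (X[:, 0] + shift * X[:, 1] > 0).astype(int)
        yield X, y

for t, (X, y) in enumerate(stream_batches()):
    if t > 0 and t % 20 == 0:
        # Test-then-train (prequential) evaluation: score the model
        # on the incoming batch before learning from it.
        print(f"batch {t:3d}: accuracy on new data = {model.score(X, y):.2f}")
    # Update the model with one pass over the new batch only;
    # earlier batches are never revisited, so memory stays constant.
    model.partial_fit(X, y, classes=classes if t == 0 else None)
```

Because `partial_fit` processes each batch exactly once, the memory footprint is independent of how much data has streamed past, which is what makes this style of learning viable at high volume and velocity; the accuracy dip after the simulated drift point shows why handling non-stationarity is the remaining challenge.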