Throughout history, humanity has collected data by observing natural phenomena. Collecting data from observations, analyzing it, and deriving higher-level insights are natural inclinations of human beings and essential processes for developing past, present, and future civilizations. The exponential growth in smart sensors and rapid progress in 5G networks create a world awash with data streams. However, high programming complexity is a critical barrier to building performant multi-sensor distributed stream processing applications.
We propose DataX, a novel platform that improves programmer productivity by enabling easy exchange, transformation, and fusion of data streams. The DataX abstraction simplifies an application's specification and exposes parallelism and dependencies among the application functions (microservices). The DataX runtime automatically sets up appropriate data communication mechanisms and enables effortless reuse of microservices and data streams across applications. DataX makes it easy to write, deploy, and reliably operate distributed applications at scale. No available stream processing system synthesizes these capabilities into a single platform.
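To make the abstraction concrete, the sketch below shows how an application might be specified as microservices wired together by named streams, with the runtime handling data delivery between them. This is a minimal illustration only; the `App`, `Stream`, and `microservice` names are assumptions for this sketch, not DataX's actual API.

```python
# Hypothetical sketch of a DataX-style specification: microservices are
# plain functions registered against named input/output streams, and the
# runtime (here, a trivial in-process dispatcher) routes data between them.
# All class and method names are assumed for illustration.

class Stream:
    """A named data stream that microservices publish to or consume from."""
    def __init__(self, name):
        self.name = name
        self.subscribers = []

    def publish(self, item):
        # Deliver the item to every registered consumer.
        for fn in self.subscribers:
            fn(item)


class App:
    """Holds the application's streams and microservice wiring."""
    def __init__(self):
        self.streams = {}

    def stream(self, name):
        # Create the stream on first use so specs can reference it freely.
        return self.streams.setdefault(name, Stream(name))

    def microservice(self, inputs, output):
        """Register a function as a microservice consuming `inputs`
        and publishing each result to `output`."""
        def wrap(fn):
            out = self.stream(output)
            def on_item(item):
                out.publish(fn(item))
            for name in inputs:
                self.stream(name).subscribers.append(on_item)
            return fn
        return wrap


app = App()

@app.microservice(inputs=["camera"], output="detections")
def detect(frame):
    # A stand-in for a real analytics function.
    return f"detected:{frame}"

# Collect results from the "detections" stream and push one frame through.
results = []
app.stream("detections").subscribers.append(results.append)
app.stream("camera").publish("frame1")
print(results)  # ['detected:frame1']
```

Because the specification names only streams and functions, the dependency graph (here, camera → detect → detections) is explicit, which is what lets a runtime choose communication mechanisms and exploit parallelism automatically.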
I have been working in computing since 2003. I started as a hobbyist and contributed to many open source projects, including CRUX (a GNU/Linux distribution), the Linux kernel, and Fedora Linux. Computing is also the main topic of my studies: I received a BS degree in Computer Science in 2010, and I am actively working as a researcher in Computer Science. My main research interests are operating systems, high-performance computing, and cloud computing. Over all these years of hobby, study, and research, I have built a deep knowledge of the internals of Linux, from the kernel to system administration and software development.