Understanding Stream Processing & Batch Processing

  • Overview

    Unified batch and stream processing.

  • Stream Processing

    Stream processing is a computer programming paradigm, equivalent to dataflow programming, event stream processing, and reactive programming, that allows some applications to more easily exploit a limited form of parallel processing.

    Such applications can use multiple computational units, such as the floating-point units on a graphics processing unit or field-programmable gate arrays (FPGAs), without explicitly managing allocation, synchronization, or communication among those units.

    The stream processing paradigm simplifies parallel software and hardware by restricting the parallel computation that can be performed. Given a sequence of data (a stream), a series of operations (kernel functions) is applied to each element in the stream.

    In the data-engineering sense, stream processing was popularized by systems such as Apache Storm.

    Stream processing is the processing of data in motion, or in other words, computing on data directly as it is produced or received.
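    The model described above can be sketched in a few lines of Python: a kernel function is applied to each element of the stream as it is produced, and results are emitted incrementally rather than after all input has been read. The event source and kernel below are hypothetical stand-ins, not part of any particular framework.

    ```python
    def event_source():
        """Hypothetical unbounded source; here, a finite stand-in."""
        for reading in [3, 1, 4, 1, 5, 9]:
            yield reading

    def kernel(x):
        """Per-element kernel function: square each reading."""
        return x * x

    def process_stream(stream):
        """Apply the kernel to each element as it arrives."""
        running_total = 0
        for element in stream:       # computed on data in motion
            running_total += kernel(element)
            yield running_total      # result available immediately

    totals = list(process_stream(event_source()))
    print(totals)  # [9, 10, 26, 27, 52, 133]
    ```

    The key property is that each output is available as soon as its input arrives; the pipeline never waits for the source to finish.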

  • Batch processing

    Computerized batch processing is the running of “jobs that can run without end user interaction, or can be scheduled to run as resources permit”.

    Batch processing is the processing of a large volume of data all at once.
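    By contrast, a batch job can be sketched as a function that reads the entire accumulated dataset first and produces its result in a single run, with no user interaction. The job logic below is a hypothetical illustration; in a real deployment the job would be scheduled (for example, via nightly cron) over data collected on disk.

    ```python
    def batch_job(records):
        """Process the whole dataset at once and return one result set."""
        squared = [r * r for r in records]   # full pass over all data
        return {"count": len(squared), "total": sum(squared)}

    # The "accumulated" data is just an in-memory list for this sketch.
    dataset = [3, 1, 4, 1, 5, 9]
    report = batch_job(dataset)
    print(report)  # {'count': 6, 'total': 133}
    ```

    Note that, unlike the streaming sketch, no output exists until every record has been read: the summary is produced only after the whole batch completes.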

Reposted from blog.csdn.net/The_Time_Runner/article/details/115295587