Stream processing

Updated: 11/16/2019 by Computer Hope

Stream processing is a model for processing an ongoing stream of data. It differs from traditional programming models, where a complete set of data is loaded from disk into memory and then processed in arbitrary ways. In stream processing, a small number of predefined operations are applied progressively, and often in parallel, to each piece of data as it becomes available in the stream.
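The contrast between the two models can be sketched in plain Python, with no streaming framework. The operation names (`parse`, `scale`) are illustrative, not from any real platform; the point is that the stream version applies the same predefined operations one record at a time, as each record arrives.

```python
def parse(record):
    """Predefined operation 1: convert a raw record to a number."""
    return int(record)

def scale(value):
    """Predefined operation 2: scale the value."""
    return value * 2

# Batch model: load the whole data set into memory first, then process it.
raw = ["1", "2", "3", "4"]
batch_result = [scale(parse(r)) for r in raw]

# Stream model: each record flows through the pipeline as it becomes
# available; nothing is held in memory beyond the record being processed.
def stream(records):
    for r in records:          # records arrive one at a time
        yield scale(parse(r))  # operations applied progressively

stream_result = list(stream(iter(raw)))
assert batch_result == stream_result == [2, 4, 6, 8]
```

Both models produce the same result here; the difference is that the stream version never needs the full data set in memory, which is what makes the model work on data that has no defined end.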

Stream processing is well-suited to DSP (digital signal processing), computer vision, digital video and image processing, and big data analysis. It enables a business to process, analyze, and draw conclusions from data in real time, as it's being collected.
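As a minimal sketch of real-time analysis, the snippet below computes a sliding-window average over a stream of hypothetical sensor readings. The readings and window size are invented for illustration; the idea is that a result is available after every arrival, before the stream ends.

```python
from collections import deque

def sliding_average(readings, window=3):
    """Yield the average of the most recent `window` readings
    after each new reading arrives."""
    buf = deque(maxlen=window)  # keeps only the last `window` items
    for r in readings:
        buf.append(r)
        yield sum(buf) / len(buf)

# Hypothetical readings arriving over time.
averages = list(sliding_average([10, 20, 30, 40], window=3))
# Early averages use fewer readings until the window fills.
assert averages == [10.0, 15.0, 20.0, 30.0]
```

Because the window holds at most three values, memory use stays constant no matter how long the stream runs, which is the property that makes this style of analysis practical on unbounded data.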

Languages and platforms

The following programming languages, platforms, and services are designed for stream processing:

  • Amazon Kinesis — A stream processing platform provided by Amazon Web Services.
  • Azure Stream Analytics — Stream processing and real-time analytics on the Microsoft Azure platform.
  • BrookGPU — An early, influential stream processing language, developed and hosted at Stanford University.
  • CUDA — Compute Unified Device Architecture, a proprietary parallel computing platform and API (application programming interface) developed by NVIDIA.
  • Flink — A stream processing engine with a focus on event processing and state management, developed under the Apache Software Foundation.
  • Google Cloud Dataflow — A fully managed stream processing service available as part of the Google Cloud Platform.
  • Kafka — An open-source stream processing software platform developed by LinkedIn and later donated to the Apache Software Foundation.
  • RaftLib — An open-source stream processing library for C++, developed at the Supercomputing Lab of Washington University in St. Louis.
  • StreamIt — A programming language for authoring stream processing systems, created at MIT (Massachusetts Institute of Technology).
