Batches that process large amounts of data often have stringent performance requirements.
While performance is often improved through individual measures such as tuning SQL statements, a significant number of cases require architectural changes as well.
proaxia offers a stream-based batch acceleration framework to solve these problems.
Stream-based data processing
Stream-based data processing applies to an RDBMS a technique similar to CEP (complex event processing), which handles large volumes of stream data unit by unit.
It executes SQL statements, cuts the large data set into units (chunks), processes business logic on each chunk in parallel, and outputs the results to a table (BulkCopy).
- Only the data currently being processed is loaded into memory, so even very large data sets consume few memory resources.
- The issued SQL statements are concentrated in the streams, so the processing cost is easy to manage with a small number of statements, and tuning points are easy to identify.
- Because each chunk is a meaningful unit for the business, business logic can be stated naturally and maintainability improves (no speed-up tricks are required).
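The chunk-based flow described above can be sketched as follows. This is a minimal illustration, not proaxia's actual framework: the table, the `fetchmany`-based chunking, and the surcharge "business logic" are all hypothetical, and SQLite stands in for a production RDBMS.

```python
import sqlite3
from concurrent.futures import ThreadPoolExecutor

CHUNK_SIZE = 2  # small for illustration; real batches use thousands of rows


def process_chunk(chunk):
    # Hypothetical business logic: apply a 10% surcharge to each order.
    return [(order_id, round(amount * 1.1, 2)) for order_id, amount in chunk]


def stream_in_chunks(cursor, chunk_size):
    # Cut the result set into fixed-size chunks instead of loading it all
    # into memory; only the current chunk is resident at any time.
    while True:
        chunk = cursor.fetchmany(chunk_size)
        if not chunk:
            break
        yield chunk


conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(1, 100.0), (2, 200.0), (3, 300.0), (4, 400.0)])

cur = conn.execute("SELECT id, amount FROM orders ORDER BY id")
results = []
with ThreadPoolExecutor(max_workers=2) as pool:
    # One SQL statement feeds the stream; chunks are processed in parallel.
    for processed in pool.map(process_chunk, stream_in_chunks(cur, CHUNK_SIZE)):
        results.extend(processed)
```

The single `SELECT` keeps the SQL footprint small and easy to tune, while the worker pool parallelizes the per-chunk business logic; in a real system the accumulated `results` would then be written back in bulk (the BulkCopy step).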
The batch acceleration framework provided by proaxia implements the stream processing concept described above.
[Reference: Effect of update processing using BulkCopy]
| Method | Run 1 | Run 2 | Run 3 |
|---|---|---|---|
| Execute SQL statement from VB.NET | 38.10 s | 37.87 s | 37.96 s |
| Stored procedure (cursor loop) | 11.79 s | 17.81 s | 17.38 s |
| Execute BulkCopy from VB.NET (updating/deleting at once using BulkCopy data) | 0.55 s | 0.74 s | 1.36 s |
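The fastest row in the table updates and deletes in bulk rather than row by row. A common way to do this, sketched below under assumed table names and with SQLite's `executemany` standing in for SqlBulkCopy's fast insert path, is to bulk-load the computed results into a staging table and then apply them with one set-based statement:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE prices (sku TEXT PRIMARY KEY, price REAL)")
conn.executemany("INSERT INTO prices VALUES (?, ?)",
                 [("A", 10.0), ("B", 20.0), ("C", 30.0)])

# 1. Bulk-load the computed results into a staging table
#    (one fast insert instead of many single-row statements).
new_prices = [("A", 11.0), ("C", 33.0)]
conn.execute("CREATE TEMP TABLE staging (sku TEXT PRIMARY KEY, price REAL)")
conn.executemany("INSERT INTO staging VALUES (?, ?)", new_prices)

# 2. Apply every change in a single set-based UPDATE, joining the
#    target table against the staging table.
conn.execute("""
    UPDATE prices
    SET price = (SELECT s.price FROM staging s WHERE s.sku = prices.sku)
    WHERE sku IN (SELECT sku FROM staging)
""")

rows = conn.execute("SELECT sku, price FROM prices ORDER BY sku").fetchall()
```

Replacing a per-row loop with one bulk load plus one set-based statement is what produces the sub-second timings in the reference table.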