Data velocity is the speed at which data is processed. This includes inputs, such as incoming social media posts, and outputs, such as the processing required to produce a report or execute a process. The following are common levels of data velocity.
Real-Time
Data is processed close to the time that it is input or needed. This implies that you are processing data fast enough to respond to events as they occur. For example, a customer submits an application for a credit card and you approve it within a few seconds, such that they immediately get an answer.

Near Real-Time
Data is processed as events occur but not fast enough to respond immediately. For example, a customer submits a credit card application and you immediately begin a process of evaluating the application, such that customers often get an answer within an hour.
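The difference between the two patterns above can be sketched in code: a real-time system decides synchronously while the caller waits, while a near real-time system queues the event for a worker that processes it shortly afterward. This is a minimal sketch; the `score_application` rule and the application fields are invented for illustration, not an actual credit-scoring method.

```python
import queue
import threading

def score_application(app):
    # Hypothetical approval rule, for illustration only.
    return app["income"] > 3 * app["requested_limit"]

# Real-time: the caller blocks until a decision is returned.
def apply_real_time(app):
    return "approved" if score_application(app) else "declined"

# Near real-time: the application is queued and a background worker
# decides soon after; the customer is notified once a decision exists.
pending = queue.Queue()
decisions = {}

def worker():
    while True:
        app = pending.get()
        if app is None:
            break  # sentinel: shut the worker down
        decisions[app["id"]] = (
            "approved" if score_application(app) else "declined"
        )
        pending.task_done()

t = threading.Thread(target=worker)
t.start()
pending.put({"id": 1, "income": 90000, "requested_limit": 5000})
pending.join()   # in practice the customer would poll or be notified
pending.put(None)
t.join()
```

In the real-time path the latency budget is seconds, so the scoring logic must be fast; in the near real-time path the queue decouples submission from evaluation, trading immediacy for throughput.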
Background Processing
Data is processed when computing resources are available, such that processing falls behind events and then catches up. For example, a social media site that runs algorithms to look for policy violations whenever computing power is free. Such algorithms may process posts that are hours or perhaps days old.

One-Time Batch
Batch processing that is unique, such that it might be executed once or on an irregular schedule. For example, a batch job that detects a certain type of social media policy violation might be executed once to clean up old posts.
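A one-time batch job like the policy clean-up described above can be sketched as a single pass over the stored backlog. The post records and the `contains_violation` rule here are invented placeholders, not a real moderation policy.

```python
# Sketch of a one-time batch job: scan stored posts for a banned phrase.
# Unlike real-time processing, the posts may already be hours or days old.
posts = [
    {"id": 1, "text": "hello world", "age_hours": 30},
    {"id": 2, "text": "buy cheap meds now", "age_hours": 72},
]

def contains_violation(text):
    # Placeholder rule standing in for a real policy check.
    return "cheap meds" in text

def run_batch(posts):
    # Process whatever backlog exists and return the ids to clean up.
    return [p["id"] for p in posts if contains_violation(p["text"])]

flagged = run_batch(posts)  # → [2]
```

Because the job runs against a fixed backlog rather than a live stream, it can be scheduled whenever capacity is free and rerun on an irregular basis.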
Analytical Processing
Processing that is tied to decision making as opposed to business processes and events. This can be done in real-time or near real-time, such that decision makers can explore data with analytical tools. It is also common to do analytical processing as batch jobs that produce regular reports.
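Analytical processing as a batch job typically means aggregating raw events into a summary a decision maker can read. The sketch below assumes a small list of invented sales events and rolls them up by region, the kind of regular report the paragraph above describes.

```python
from collections import defaultdict

# Invented event data for illustration.
events = [
    {"region": "east", "sales": 120.0},
    {"region": "west", "sales": 80.0},
    {"region": "east", "sales": 40.0},
]

def sales_by_region(events):
    # Aggregate raw events into a per-region summary report.
    totals = defaultdict(float)
    for e in events:
        totals[e["region"]] += e["sales"]
    return dict(totals)

report = sales_by_region(events)  # {'east': 160.0, 'west': 80.0}
```

The same aggregation could instead run continuously against a live stream when decision makers need near real-time dashboards rather than periodic reports.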
© 2010-2023 Simplicable. All Rights Reserved. Reproduction of materials found on this site, in any form, without explicit permission is prohibited.