Technological Requirements of Big Data

The big data process requires clustered systems that provide the necessary computing power. Businesses typically work through a few steps. First, define the business objectives, which means involving stakeholders from the start and employing data scientists and other specialists. Second, identify the data sources, decide what format the data should reside in, and begin sorting and analysing it. Third, identify and prioritise what matters, so that only the information that is needed and agreed upon is kept, held and used, and nothing more. Lastly, formulate a big data roadmap so that any missing pieces and gaps surrounding the data become visible.
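
To give a feel for the sorting-and-analysing step on a clustered system, here is a minimal sketch using Apache Spark (PySpark) as an assumed example of such a system; the file name and column names are hypothetical and only stand in for whatever data sources a business has identified.

```python
# A minimal sketch, assuming PySpark is installed; "orders.csv" and the
# column names below are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# A SparkSession distributes work across whatever cluster it points at;
# "local[*]" simply uses all cores on one machine for testing.
spark = (SparkSession.builder
         .master("local[*]")
         .appName("big-data-sketch")
         .getOrCreate())

# Identify a data source and the format it resides in (CSV here).
orders = spark.read.csv("orders.csv", header=True, inferSchema=True)

# Keep only the columns the business has agreed it needs, nothing more.
trimmed = orders.select("customer_id", "order_total", "order_date")

# Sort and analyse: total spend per customer, highest first.
spend = (trimmed.groupBy("customer_id")
                .agg(F.sum("order_total").alias("total_spend"))
                .orderBy(F.desc("total_spend")))

spend.show(10)
spark.stop()
```

The same script runs unchanged whether the master is a single laptop or a full cluster, which is the point of using clustered systems for this kind of work: the framework spreads the sorting and aggregation across whatever computing power is available.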

https://www.techtarget.com/searchdatamanagement/definition/big-data
