
Limitations of Traditional Data Analysis

 Some limitations of traditional data analysis are:

Lack of data sources: it relies solely on deliberately organised data, such as that gathered through surveys, interviews, questionnaires or other similar forms of data collection. Because of this, it is limited in its ability to give a complete picture of what the business world is truly like.

Limited scope: it most often provides insight into only one specific part of the business. For example, a survey about customer experience at a business may say that an employee was rude or unkind, but it gives no insight into how the customer may have behaved to receive that kind of treatment.

It is also time-consuming, and because of this a business cannot make decisions quickly or efficiently. For example, a survey or questionnaire can take a long time to run, and the responses then have to be sorted and analysed, which can take weeks if not months to finish.

Another limitation is that it is not always accurate and can be prone to mistakes and misleading results; for example, a questionnaire may reflect the thoughts and feelings of only a select few people rather than everyone.
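The sampling problem above can be shown with a minimal sketch (the satisfaction scores, population size and sample size here are all invented for illustration):

```python
import random
import statistics

random.seed(1)  # fixed seed so the illustration is repeatable

# Hypothetical satisfaction scores (1-10) for 1,000 customers.
population = [random.randint(1, 10) for _ in range(1000)]

# A questionnaire that only reaches 20 of those customers.
sample = random.sample(population, 20)

print("population mean:", statistics.mean(population))
print("sample mean:    ", statistics.mean(sample))
# The two means can differ noticeably: a small sample does not
# reliably reflect the views of the whole customer base.
```

Running this a few times with different seeds shows the sample mean wandering around the population mean, which is exactly why conclusions drawn from a handful of questionnaire responses can be inaccurate.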

A final example is that it can be very expensive and require particular resources to complete, which makes it especially difficult for small businesses to carry out and to use when making decisions.


https://fastercapital.com/topics/the-limitations-of-traditional-data-analysis-methods.html


