Handling Big Data in a Data Warehouse
I am a learner in Big Data concepts. Based on my understanding, Big Data is critical for handling unstructured data at high volume. When we look at the Big Data architecture for a data warehouse (DW), the data from the source is extracted through Hadoop (HDFS and MapReduce), the relevant unstructured information is converted into valid business information, and finally the data is loaded into the DW or data mart through ETL processing (alongside the existing structured-data processing).
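To make my understanding concrete, here is a minimal sketch of the flow I mean. Everything here is assumed for illustration: the log format, the regex, and the `fact_page_hit` table are hypothetical, and an in-memory SQLite table stands in for the real DW/data mart target.

```python
import re
import sqlite3

# Hypothetical unstructured source data (e.g. web-server logs on HDFS).
RAW_LOGS = [
    '10.0.0.1 - - [02/Jan/2024] "GET /products/42 HTTP/1.1" 200',
    '10.0.0.2 - - [02/Jan/2024] "GET /products/7 HTTP/1.1" 404',
    'garbage line that the extract step should skip',
]

# Assumed log layout; a real pipeline would match its actual source format.
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) - - \[(?P<date>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]+" (?P<status>\d+)'
)

def extract(lines):
    """Map-style step: keep only lines that parse into business-relevant fields."""
    for line in lines:
        m = LOG_PATTERN.match(line)
        if m:
            yield (m['ip'], m['date'], m['path'], int(m['status']))

def load(rows):
    """ETL load step: insert the structured rows into a (toy) warehouse table."""
    con = sqlite3.connect(':memory:')
    con.execute(
        'CREATE TABLE fact_page_hit (ip TEXT, hit_date TEXT, path TEXT, status INT)'
    )
    con.executemany('INSERT INTO fact_page_hit VALUES (?, ?, ?, ?)', rows)
    return con

con = load(extract(RAW_LOGS))
count = con.execute('SELECT COUNT(*) FROM fact_page_hit').fetchone()[0]
print(count)  # the unparseable line is dropped, so 2 rows land in the table
```

In a real architecture the `extract` step would run as a distributed MapReduce (or Spark) job over HDFS, and `load` would be an ETL job targeting the warehouse; the sketch only shows the shape of the flow.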
However, I would like to know what new techniques, dimensional models, or storage requirements Big Data introduces at the DW side for an organization, as most of the tutorials/resources I have found only discuss Hadoop at the source, not at the target. Also, how does the introduction of Big Data affect an organization's predefined reports and ad-hoc analysis, given the high volume of data?
I would appreciate your response.