Data Life Cycle

The data life cycle is pictorially defined as shown below. As we can see, in a traditional system we capture/extract data, then store it, and later process it for reporting and analytics. With big data, however, the challenge lies in storing that volume of data and then processing it quickly. Hadoop takes over this portion: it stores the data in an efficient format (the Hadoop Distributed File System, HDFS) and processes it using its engine (the MapReduce engine).

Since the MapReduce engine needs the data to be in HDFS before it can process it, there are well-established tools in the market to perform this movement. For example, Sqoop is a tool that transfers data from an RDBMS into HDFS. Likewise, SAP BODS (BusinessObjects Data Services) can be used to move SAP system data into HDFS.
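To make the MapReduce idea concrete, here is a minimal word-count sketch in plain Python (no Hadoop cluster required) that mimics the three phases Hadoop runs for you: map, shuffle, and reduce. The function names are illustrative only, not part of any Hadoop API.

```python
from collections import defaultdict

def map_phase(lines):
    # Map: emit a (word, 1) pair for every word in every input line.
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def shuffle_phase(pairs):
    # Shuffle: group all emitted values by key, as Hadoop does
    # between the map and reduce stages.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: aggregate the grouped values -- here, sum the counts per word.
    return {word: sum(counts) for word, counts in groups.items()}

lines = ["big data needs storage", "big data needs processing"]
counts = reduce_phase(shuffle_phase(map_phase(lines)))
print(counts)  # {'big': 2, 'data': 2, 'needs': 2, 'storage': 1, 'processing': 1}
```

In a real Hadoop job, the input lines would come from files stored in HDFS (for example, loaded there by Sqoop from an RDBMS), and the framework would distribute the map and reduce tasks across the cluster.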