Great story on ITNews today on how mining giant Rio Tinto is using technology and big data to assist in mining exploration. Rio has massive-scale data requirements, and storing and analysing this data is not possible without big data methodologies. Use of cloud-based storage infrastructure such as Google, Amazon and Microsoft Azure is giving Rio the edge.
As an example, Rio's 900 haul trucks generate almost 5TB of data each day, providing information that helps defer hundreds of thousands of dollars in maintenance costs.
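To put those figures in perspective, here is a back-of-the-envelope sketch of the per-truck and yearly data volumes. The 900-truck and ~5TB/day numbers come from the article; everything else is simple illustrative arithmetic, not Rio's actual reporting.

```python
# Rough scale of the haul-truck telemetry described above.
# Fleet size and daily total are from the article; the derived
# figures are back-of-the-envelope estimates only.

TRUCKS = 900
DAILY_TOTAL_TB = 5  # ~5 TB generated by the whole fleet per day

# Average data produced by a single truck each day, in GB
per_truck_gb_per_day = DAILY_TOTAL_TB * 1024 / TRUCKS

# Fleet-wide volume over a year, in PB
yearly_total_pb = DAILY_TOTAL_TB * 365 / 1024

print(f"Per truck: ~{per_truck_gb_per_day:.1f} GB/day")
print(f"Fleet per year: ~{yearly_total_pb:.2f} PB")
```

That works out to roughly 5-6GB per truck per day, and close to two petabytes a year across the fleet, which is why cloud storage and big data tooling are a necessity rather than a luxury here.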
Rio calls it the "Mine of the Future" program, and in addition to cloud-based storage, Rio has a centre in Brisbane called the Processing Excellence Centre (PEC), which uses big data analytics to process this data.
The company is also one of the biggest users of autonomous technology in the resources sector, including driverless trucks and unmanned aerial systems.
Big data is a broad term for data sets so large or complex that traditional data processing applications are inadequate. Challenges include analysis, capture, data curation, search, sharing, storage, transfer, visualisation, querying and information privacy.
The term often refers simply to the use of predictive analytics or certain other advanced methods to extract value from data, and seldom to a particular size of data set. Accuracy in big data may lead to more confident decision making, and better decisions can result in greater operational efficiency, cost reduction and reduced risk.
Analysis of data sets can find new correlations to spot business trends, prevent diseases, combat crime and so on. Scientists, business executives, medical practitioners, advertisers and governments alike regularly meet difficulties with large data sets in areas including Internet search, finance and business informatics. Scientists encounter limitations in e-Science work, including meteorology, genomics, connectomics, complex physics simulations, biology and environmental research.