Big Data: Reliable, Scalable & Distributed Computing
Reliability

As businesses recognize that the Big Data trend is here to stay, publishers are looking for reliable support. There are many forms of reliability, all of which affect the overall reliability of the instrument and therefore of the data collected. Reliability is an essential prerequisite for validity: it is possible to have a reliable measure that is not valid, but a valid measure must also be reliable.

Scalability

Scalability is the ability of a system, network, or process to handle a growing amount of work in a capable manner, or its ability to be enlarged to accommodate that growth.[1] For example, it can refer to the capability of a system to increase total throughput under an increased load when resources (typically hardware) are added.
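A minimal sketch of that idea: split a job into partitions and hand each partition to a separate worker, so that adding workers increases the number of partitions processed in parallel. The `count_words` job and the round-robin partitioning are illustrative choices, not from the source; real systems would add servers or processes rather than threads.

```python
from concurrent.futures import ThreadPoolExecutor

def count_words(chunk):
    # Each worker handles one partition of the input lines.
    return sum(len(line.split()) for line in chunk)

def total_word_count(lines, workers):
    # Horizontal scaling sketch: partition the input round-robin into
    # `workers` chunks, process each chunk concurrently, combine results.
    chunks = [lines[i::workers] for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(count_words, chunks))
```

The per-partition results combine with a simple sum, so the answer is the same regardless of how many workers are used; only the degree of parallelism changes.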

Distributed Computing

– Multiple architectures and use cases

– Focus today: using multiple servers, each working on part of the job, each doing the same task

– Key Challenges:

  • Work distribution and orchestration
  • Error recovery
  • Scalability and management
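The challenges above can be sketched in one small coordinator: it distributes tasks to a pool of workers, collects results, and retries tasks that fail. The `run_job` function, its retry budget, and the worker interface are all illustrative assumptions, not an API from the source.

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def run_job(tasks, worker_fn, num_workers=4, max_retries=2):
    """Coordinator sketch: distribute tasks across workers,
    gather results, and re-queue failed tasks (error recovery)."""
    results = {}
    attempts = {t: 0 for t in tasks}
    queue = list(tasks)
    with ThreadPoolExecutor(max_workers=num_workers) as pool:
        while queue:
            # Work distribution: submit every pending task to the pool.
            futures = {pool.submit(worker_fn, t): t for t in queue}
            queue = []
            for fut in as_completed(futures):
                task = futures[fut]
                try:
                    results[task] = fut.result()
                except Exception:
                    # Error recovery: retry until the budget is exhausted.
                    attempts[task] += 1
                    if attempts[task] <= max_retries:
                        queue.append(task)
                    else:
                        results[task] = None  # give up on this task
    return results
```

Scalability here is just `num_workers`; in a real cluster the same loop would dispatch over the network to separate servers, and orchestration frameworks handle the bookkeeping this sketch does by hand.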
