...
Swarm Configurator is a tool that determines the total hardware specifications for the various components in a requested cluster. Customer input is required for the cluster specifications; typical data collected through the DataCore Cloud UI includes:
Storage Characteristics
Data Protection requirements
Replication across clusters
Erasure Coding for clusters
Client Characteristics
...
Once the above data is collected, the hardware requirements are estimated using either a simple Machine Learning model or t-shirt size estimation.
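T-shirt size estimation simply maps the collected inputs onto a small set of coarse buckets. A minimal sketch of the idea, assuming hypothetical capacity thresholds (the bucket boundaries and function name are illustrative, not Swarm Configurator's actual logic):

```python
def tshirt_size(raw_tb: float) -> str:
    """Map required raw capacity to a coarse t-shirt size bucket.

    Thresholds here are illustrative assumptions only.
    """
    if raw_tb < 100:
        return "S"
    if raw_tb < 500:
        return "M"
    if raw_tb < 2000:
        return "L"
    return "XL"

# e.g. a 750 TB requirement falls into the "L" bucket
print(tshirt_size(750))  # → L
```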
...
Information
...
Customer Inputs
There are three types of inputs (Storage Characteristics, Data Protection, and Client Characteristics) required from the customer to determine the hardware components. The outcome is displayed under the Results tab in tabular format, which you can download as a YAML file if needed.
...
Number of logical objects in millions
Average object size in MB
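These two inputs together determine the logical capacity the cluster must hold. A minimal sketch of that arithmetic (the function name and decimal-unit conversion are assumptions for illustration, not Swarm Configurator's internal formula):

```python
def raw_capacity_tb(objects_millions: float, avg_object_mb: float) -> float:
    """Estimate logical capacity in TB from object count and average object size."""
    total_mb = objects_millions * 1_000_000 * avg_object_mb
    return total_mb / 1_000_000  # 1 TB = 1,000,000 MB (decimal units assumed)

# e.g. 500 million objects averaging 2 MB each
print(raw_capacity_tb(500, 2))  # → 1000.0 (TB)
```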
Data Protection
There are two types of data protection methods available in Swarm Configurator; you can apply only one protection method at a time, so choose an option accordingly.
...
No. of concurrent clients – The number of clients concurrently connected to the storage for accessing data.
Write throughput per client (Mbps) – The rate, in Mbps, at which each client writes data to or reads data from the storage.
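The two client inputs above combine into the total sustained throughput the cluster must absorb. A minimal sketch, assuming a hypothetical helper name and a simple multiplicative model:

```python
def aggregate_throughput_gbps(num_clients: int, per_client_mbps: float) -> float:
    """Total sustained throughput the cluster must handle, in Gbps."""
    return num_clients * per_client_mbps / 1000  # 1 Gbps = 1000 Mbps

# e.g. 200 clients each writing at 50 Mbps
print(aggregate_throughput_gbps(200, 50))  # → 10.0 (Gbps)
```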
Outcome
Hardware Components
The hardware configuration is based on the customer inputs. Swarm Configurator calculates the required hardware components and presents them in three categories:
Storage Nodes
Elastic Search
Gateway
Results
This menu provides complete results, including the data collected from the customer and the configurations calculated by Swarm Configurator. The result is displayed in tabular format, which you can export as a YAML file.
...
What is Swarm Configurator – Reverse?
...
Replication
No. of Replicas – A maximum of two or three replicas is recommended, balancing I/O performance against memory and capacity overhead.
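Replication multiplies the physical capacity required: every object is stored in full once per replica. A minimal sketch of that overhead (function name assumed for illustration):

```python
def replicated_capacity_tb(logical_tb: float, replicas: int) -> float:
    """Physical capacity needed when every object is stored `replicas` times."""
    return logical_tb * replicas

# e.g. 100 TB of logical data with 3 replicas occupies 300 TB physically
print(replicated_capacity_tb(100, 3))  # → 300 (TB)
```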
Erasure Coding
Erasure Data – The number of data blocks into which the original data is fragmented.
Erasure Parity – The number of parity blocks computed from the data blocks. Parity blocks are stored alongside the data blocks and allow the original data to be reconstructed if some blocks are lost.
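With erasure coding, the storage overhead is the ratio of total blocks (data + parity) to data blocks, which is typically much lower than full replication. A minimal sketch of that calculation (function name assumed for illustration):

```python
def ec_physical_tb(logical_tb: float, data_blocks: int, parity_blocks: int) -> float:
    """Physical capacity for an erasure-coding scheme with k data and p parity blocks."""
    return logical_tb * (data_blocks + parity_blocks) / data_blocks

# A 10+2 scheme stores only 1.2x the logical data,
# versus 3x for three full replicas of the same 100 TB
print(ec_physical_tb(100, 10, 2))  # → 120.0 (TB)
```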
Hardware Components
...