...
Swarm Configurator is a tool that determines the hardware specifications needed for the various components in a Swarm cluster. Customer input is required for the cluster specifications, so the following data is collected through the DataCore Cloud UI:
Storage Characteristics
Data Protection Requirements
Cluster configuration
Protection Scheme configuration (for example, Erasure Coding scheme)
Client Characteristics
...
Customer Inputs
...
Two data protection methods are available in Swarm Configurator; only one protection method can be applied at a time, so choose an option accordingly.
Replication
Number of Replicas – Capacity is based on the number of replicas, so it is recommended to use fewer replicas (a maximum of two or three). More replicas require more memory, which reduces I/O capacity and slows data access.
Erasure Coding
Erasure Data – The number of data blocks used to store the fragmented data.
Erasure Parity – The number of parity blocks computed from the data blocks. Parity blocks are stored alongside the data blocks and allow the data to be reconstructed if drives fail.
Segment Size (MB) – The size of each block in megabytes.
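To illustrate how these inputs drive capacity, the following is a minimal Python sketch (not part of Swarm Configurator itself; the function names and the 5+2 example scheme are illustrative) comparing the raw-capacity overhead of replication and erasure coding:

```python
import math

def replication_overhead(replicas: int) -> float:
    """Raw capacity consumed per unit of logical data under replication."""
    return float(replicas)

def erasure_coding_overhead(data: int, parity: int) -> float:
    """Raw capacity consumed per unit of logical data under a
    data+parity erasure coding scheme."""
    return (data + parity) / data

def segments_for_object(object_mb: float, segment_mb: float) -> int:
    """Number of segments an object is split into before encoding."""
    return math.ceil(object_mb / segment_mb)

# Three replicas store every byte three times; a 5+2 EC scheme stores 1.4x.
print(replication_overhead(3))        # 3.0
print(erasure_coding_overhead(5, 2))  # 1.4
print(segments_for_object(100, 32))   # a 100 MB object, 32 MB segments -> 4
```

This is why fewer replicas (or a wide erasure coding scheme) leave more usable capacity for the same raw hardware.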
Client Characteristics
...
Number of concurrent clients – The number of clients concurrently connected to the storage for accessing data.
Write throughput per client (Mbps) – The rate at which each client writes data, in megabits per second.
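These two client inputs combine into the total write load the cluster must sustain. A hedged sketch (the function name is illustrative, not a Swarm Configurator API):

```python
def aggregate_write_throughput(clients: int, per_client_mbps: float) -> float:
    """Total write load on the cluster in megabits per second."""
    return clients * per_client_mbps

# e.g. 50 concurrent clients writing at 100 Mbps each -> 5000 Mbps (5 Gbps)
print(aggregate_write_throughput(50, 100))  # 5000
```

The aggregate figure is what the network ports and gateway sizing must accommodate, not the per-client rate alone.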
Hardware Components
The inputs from the customer are optional for hardware components. If not provided, Swarm Configurator calculates the required hardware components based on other inputs such as Storage Characteristics, Data Protection Requirements, and Client Characteristics, and presents them in three categories:
Storage Nodes
Elasticsearch
Gateway
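Swarm Configurator's actual sizing logic is internal; the following hypothetical Python sketch only illustrates the kind of calculation involved, deriving a storage node count from the logical data set, the protection overhead, and the per-node drive configuration (all names and figures are illustrative assumptions):

```python
import math

def estimate_storage_nodes(logical_tb: float, overhead: float,
                           drives_per_node: int, drive_tb: float) -> int:
    """Minimum storage nodes needed to hold the protected data set.

    overhead is the raw-to-logical capacity ratio of the chosen
    protection scheme (e.g. 3.0 for 3 replicas, 1.4 for 5+2 EC).
    """
    raw_tb_needed = logical_tb * overhead
    raw_tb_per_node = drives_per_node * drive_tb
    return math.ceil(raw_tb_needed / raw_tb_per_node)

# 500 TB of data under a 5+2 EC scheme (1.4x overhead), nodes with
# 12 x 8 TB drives: 700 TB raw needed / 96 TB per node -> 8 nodes
print(estimate_storage_nodes(500, 7 / 5, 12, 8))  # 8
```

A real sizing would also reserve headroom for failure recovery and growth, which is why the tool takes the full set of inputs above rather than capacity alone.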
Results
...
Data Protection
Replication
Number of Replicas – A maximum of two or three replicas is recommended for faster I/O and a lighter memory load.
Erasure Coding
Erasure Data – The number of data blocks to store the fragmented data.
Erasure Parity – The number of parity blocks encoded from the data blocks. Parity blocks are stored alongside the data blocks and allow the data to be reconstructed if drives fail.
...
The input is collected from the customer for the following hardware components:
...
Hard drive size in terabytes (TB).
Number of hard drives required for each node
Network speed in Gbps
Number of network ports for each node
CPU cores required for each node (e.g., 32, 64, etc.)
Total RAM required for each node in gigabytes (GB)
...