
...

Usage

...

  • Manifest File - Contains information about the copied focus dataset, such as the total object count and size, endpoint, target bucket name, access key, and secret key (a verification sketch based on these totals follows this list).

  • Dictionary File(s) - Lists all focus datasets copied when the focus dataset is defined on the Swarm side. The dictionary file shows the name of each object along with its size in bytes. Note that these files are not generated when an entire bucket or folder is copied to the cloud.

  • Log File - Provides a run summary or error message and contains the current status of each object of the copied focus dataset, along with details from the final check. The log file is generated after the objects are queued up to copy and is refreshed every two minutes. There are four potential statuses for each object:

    • Pending - An initial state where the object is queued up for the copy.

    • Failed - When an object fails to copy to the target storage. Reasons include an unreachable target endpoint or an issue with an individual object, such as an object name that is too long for S3. See the Gateway server log for failure details if not specified within the log file.

    • Copied - When an object is successfully copied.

    • Skipped - When an object is skipped. Reasons include the object already exists at the destination, the object does not exist at the source, or the object is marked for deletion and cannot be copied.

  • Result Summary - Provided with Content Portal 7.7 and later. The result summary appears in JSON format after job completion, with a link to the log file.
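
As a quick spot check after a push job completes, the totals recorded in the manifest file can be compared against what is actually present in the remote bucket. The sketch below is a minimal example using boto3; the endpoint, region, bucket, folder, and credential values are placeholders standing in for the values entered in the job form.

    # Minimal sketch: tally object count and total bytes in the remote S3 bucket
    # so the totals can be compared against the manifest file of a push job.
    # Endpoint, region, bucket, folder, and credentials are placeholders.
    import boto3

    s3 = boto3.client(
        "s3",
        endpoint_url="https://s3.example.com",
        region_name="us-east-1",
        aws_access_key_id="REMOTE_ACCESS_KEY",
        aws_secret_access_key="REMOTE_SECRET_KEY",
    )

    count = 0
    total_bytes = 0
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket="remote-bucket", Prefix="remote/folder/"):
        for obj in page.get("Contents", []):
            count += 1
            total_bytes += obj["Size"]

    print(f"{count} objects, {total_bytes} bytes in the remote bucket")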

...

  1. Navigate to the Swarm UI bucket or collection to copy.

  2. Click Actions (three gears icon) and select either “Push to cloud” or “Pull from cloud” (formerly Copy to S3). Select S3 or Azure depending on the remote endpoint.

    (Image: Actions menu)


    A modal presents a form, with required fields marked with asterisks (*) as shown in the example below:

    (Image: copy job form)

    • Job Name - Provide a unique name for the job. The manifest and log files are overwritten if a job name is reused from a previous run.

    • Local Path - Provide the bucket name and any optional subfolders. Objects are copied directly to the bucket if the field is left blank. This is an option when pulling from the cloud.

    • Object Selection - An option when pulling from the cloud.

      • All in the remote path - Select this option to copy all the objects/files from the remote location.

      • Only objects matching the current collection/bucket list - Select this option to repatriate only the objects/files that already exist in the Swarm destination, updating them to the remote version.

    • S3

      • Endpoint - A remote service endpoint.

        • For AWS S3 endpoints - The format is shown in the screenshot above.

        • For Swarm endpoints - The value needs to be in the following format with HTTP or HTTPS as needed:
          https://{DOMAIN}:{S3_PORT}

      • Region - The S3 region to use. Some S3 providers may not require a region.

      • Remote Bucket - Enter the remote bucket name.

      • Remote Folder - An optional folder path within the remote bucket.

      • Access Key - An access key for the remote bucket; it must be generated within the remote cloud storage service.

      • Secret Key - An S3 secret key, generated with the access key. A sketch for sanity-checking the endpoint, bucket, and keys appears after these steps.

    • Azure Blob

      • Remote Container - Enter the Azure container name.

      • Remote Folder - An optional folder path within the remote container.

      • Authentication Method - Either Account and Key or SAS URL.

        • The SAS URL needs the list permission in addition to other relevant file permissions. A sketch for generating such a SAS URL appears after these steps.

  3. Click Begin Copy. This button is enabled once all required text fields are filled.
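
Before clicking Begin Copy, it can be worth confirming that the S3 endpoint, remote bucket, and keys entered in the form are reachable. The following minimal sketch uses boto3; every value shown is a placeholder for the corresponding form field.

    # Minimal sketch: sanity-check the S3 endpoint, remote bucket, and keys
    # entered in the form before starting the job. All values are placeholders;
    # the endpoint here follows the Swarm format https://{DOMAIN}:{S3_PORT}.
    import boto3
    from botocore.exceptions import ClientError, EndpointConnectionError

    s3 = boto3.client(
        "s3",
        endpoint_url="https://swarm.example.com:8090",
        aws_access_key_id="REMOTE_ACCESS_KEY",
        aws_secret_access_key="REMOTE_SECRET_KEY",
    )

    try:
        s3.head_bucket(Bucket="remote-bucket")                 # reachability and basic access
        s3.list_objects_v2(Bucket="remote-bucket", MaxKeys=1)  # list permission
        print("Endpoint, bucket, and credentials look usable.")
    except (ClientError, EndpointConnectionError) as exc:
        print(f"Check the form values before starting the job: {exc}")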
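For the Azure Blob option, the SAS URL must grant the list permission as well as the usual read/write permissions, as noted above. The sketch below uses the azure-storage-blob SDK to generate such a SAS URL; the account name, account key, and container name are placeholders.

    # Minimal sketch: generate a container-level SAS URL that includes the list
    # permission in addition to read/write/create/delete.
    # Account name, account key, and container name are placeholders.
    from datetime import datetime, timedelta, timezone

    from azure.storage.blob import ContainerSasPermissions, generate_container_sas

    sas_token = generate_container_sas(
        account_name="mystorageaccount",
        container_name="remote-container",
        account_key="ACCOUNT_KEY",
        permission=ContainerSasPermissions(
            read=True, write=True, create=True, delete=True, list=True
        ),
        expiry=datetime.now(timezone.utc) + timedelta(hours=24),
    )

    # Paste the resulting URL into the SAS URL field of the form.
    sas_url = f"https://mystorageaccount.blob.core.windows.net/remote-container?{sas_token}"
    print(sas_url)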

...