Installing and Configuring Altoros Spark for PCF

This topic describes how to install and configure Altoros Spark for PCF.

  1. Download the product file from Pivotal Network.

  2. Navigate to the Ops Manager Installation Dashboard and click Import a Product to upload the product file.

  3. Click Add next to the uploaded Altoros Spark for PCF tile in the Ops Manager Available Products view to add it to your staging area.

  4. Click the newly added Altoros Spark for PCF tile.

  5. Click Altoros Spark for PCF to open the configuration pane.

  6. For Apache Spark VM Type, use the dropdown menu to select the type of VM to use for each Apache Spark node. The dropdown is automatically populated with the VM types configured for your cloud environment.

  7. For Apache Spark Disk Type, use the dropdown menu to select the type of disk to use for each Apache Spark node.

  8. For Apache Spark Availability Zone(s), select the checkboxes that correspond to the availability zones (AZs) where you plan to deploy Apache Spark. Each selected AZ must have an associated service network.

  9. Click Save.

  10. Return to the Ops Manager Installation Dashboard and click Apply Changes to install the Altoros Spark for PCF tile. If you prefer to script this procedure, see the om CLI sketch after this list.

  11. After the installation finishes, see Creating and Binding Apache Spark Service Instances for instructions on creating and binding Apache Spark service instances. A generic cf CLI sketch also appears at the end of this topic.
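
If you prefer to automate the upload, staging, configuration, and deployment steps above rather than clicking through the Ops Manager UI, the om CLI can drive the same flow. The following is a minimal sketch, not the tile's documented procedure: the Ops Manager URL and credentials, the product slug (altoros-spark), the product version, and the contents of config.yml are assumptions that you must replace with values from your own environment.

```bash
# Minimal om CLI sketch of the procedure above.
# Assumptions: the Ops Manager URL/credentials, the product slug
# "altoros-spark", the version, and config.yml are placeholders.
export OM_TARGET=https://opsman.example.com
export OM_USERNAME=admin
export OM_PASSWORD=example-password

# Steps 1-2: upload the product file downloaded from Pivotal Network.
om --skip-ssl-validation upload-product \
  --product ./altoros-spark-for-pcf-1.0.0.pivotal

# Step 3: stage (add) the tile.
om --skip-ssl-validation stage-product \
  --product-name altoros-spark \
  --product-version 1.0.0

# Export the staged configuration as a starting point, then edit it to set
# the Apache Spark VM type, disk type, and availability zones (steps 6-9).
om --skip-ssl-validation staged-config \
  --product-name altoros-spark > config.yml

# Steps 6-9: apply the edited configuration.
om --skip-ssl-validation configure-product \
  --config config.yml

# Step 10: apply changes to deploy the tile.
om --skip-ssl-validation apply-changes
```

The file exported by om staged-config contains network-properties and resource-config sections; the service network, availability zones, VM type, and persistent disk for the Apache Spark nodes (steps 6 through 9) are set there. Property names vary by tile version, so treat the exported file as the authoritative template.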

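Creating and Binding Apache Spark Service Instances covers the next steps in detail. As a hedged preview, the generic cf CLI commands look like the sketch below; the service offering name (altoros-spark), plan (standard), service instance name, and app name are placeholders, so run cf marketplace to find the names the tile actually registers.

```bash
# Hypothetical names: the service offering, plan, instance, and app shown
# here are placeholders; check `cf marketplace` for the real values.
cf marketplace

# Create an Apache Spark service instance from the marketplace offering.
cf create-service altoros-spark standard my-spark

# Bind the instance to an application and restage it so the binding
# credentials are injected into the app environment.
cf bind-service my-app my-spark
cf restage my-app
```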