# Managing Service Instances
See below for information about managing Data Flow service instances using the Cloud Foundry Command Line Interface tool (cf CLI). You can also manage Data Flow service instances using Pivotal Cloud Foundry® Apps Manager.
## Creating an Instance
Important: If you are using the Redis for PCF product for the Spring Cloud Data Flow analytics store, you cannot create more Data Flow service instances than the Redis product's Service Instance Limit allows (the default Service Instance Limit for Redis for PCF is 5). For information about configuring this limit, see the Shared-VM Plan section of the Installing and Upgrading Redis for PCF topic in the Redis for PCF documentation.
Begin by targeting the correct org and space.
```
$ cf target -o myorg -s development
api endpoint:   https://api.system.wise.com
api version:    2.75.0
user:           user
org:            myorg
space:          development
```
You can view plan details for the Data Flow product using `cf marketplace -s`.
```
$ cf marketplace
Getting services from marketplace in org myorg / space development as user...
OK

service               plans      description
p-dataflow            standard   Deploys Spring Cloud Data Flow servers to orchestrate data pipelines
p-dataflow-mysql      proxy      Proxies to the Spring Cloud Data Flow MySQL service instance
p-dataflow-rabbitmq   proxy      Proxies to the Spring Cloud Data Flow RabbitMQ service instance
p-dataflow-redis      proxy      Proxies to the Spring Cloud Data Flow Redis service instance

TIP: Use 'cf marketplace -s SERVICE' to view descriptions of individual plans of a given service.

$ cf marketplace -s p-dataflow
Getting service plan information for service p-dataflow as user...
OK

service plan   description     free or paid
standard       Standard Plan   free
```
Each Data Flow service instance uses three dependent data services. Defaults for these services can be configured in the tile settings, and these defaults can be overridden for each individual service instance at create time.
Note: The service offerings with the plan `proxy` are proxy services used by Spring Cloud Data Flow for PCF service instances. The Spring Cloud Data Flow service broker automatically creates and deletes instances of these services along with each Spring Cloud Data Flow service instance. Do not create or delete instances of these services manually.
General parameters used to configure dependent data services for a Data Flow service instance are listed below.
| Parameter | Function |
|---|---|
| `relational-data-service.name` | The name of the service to use for a relational database that stores Spring Cloud Data Flow metadata and task history. |
| `relational-data-service.plan` | The name of the service plan to use for the relational database service. |
| `messaging-data-service.name` | The name of the service to use for a RabbitMQ or Kafka server that facilitates event messaging. |
| `messaging-data-service.plan` | The name of the service plan to use for the RabbitMQ or Kafka service. |
| `analytics-data-service.name` | The name of the service to use for a Redis server that stores analytics data. |
| `analytics-data-service.plan` | The name of the service plan to use for the Redis service. |
| `skipper-relational.name` | The name of the service to use for a relational database used by the Skipper application. |
| `skipper-relational.plan` | The name of the service plan to use for the Skipper application's relational database. |
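As an illustration, the parameters above can be collected in a JSON file and passed to `cf create-service` with `-c`, which accepts either an inline JSON string or a path to a JSON file. The service and plan names below are hypothetical; substitute offerings from your own marketplace.

```shell
# Hypothetical service and plan names -- replace with offerings listed
# by `cf marketplace` in your own environment.
cat > dataflow-params.json <<'EOF'
{
  "relational-data-service": { "name": "p.mysql",    "plan": "db-small" },
  "messaging-data-service":  { "name": "p.rabbitmq", "plan": "single-node" }
}
EOF

# Sanity-check the JSON before handing it to the broker.
python3 -m json.tool dataflow-params.json > /dev/null && echo "JSON OK"

# Then pass the file to cf create-service:
# cf create-service p-dataflow standard data-flow -c dataflow-params.json
```

Validating the file first avoids a round trip to the broker for a malformed parameter document.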
Each Data Flow service instance can optionally specify Maven configuration properties. For the complete list of properties that can be specified, see the “Maven” section in the OSS Spring Cloud Data Flow documentation.
Maven configuration properties can be set for each Data Flow service instance using parameters given to `cf create-service`. To set the `maven.remote-repositories.repo1.url` property, you might use a command such as the following:
```
$ cf create-service p-dataflow standard data-flow -c '{"maven.remote-repositories.repo1.url": "https://repo.spring.io/libs-snapshot"}'
```
Create the service instance using `cf create-service`. To create a Data Flow service instance that uses a Redis Cloud service available from your PCF marketplace and sets the Maven `maven.remote-repositories.repo1.url` property to `https://repo.spring.io/libs-snapshot`, you might run:
```
$ cf create-service p-dataflow standard data-flow -c '{ "analytics-data-service": { "name": "rediscloud", "plan": "30mb" }, "maven.remote-repositories.repo1.url": "https://repo.spring.io/libs-snapshot" }'
Creating service instance data-flow in org myorg / space development as user...
OK

Create in progress. Use 'cf services' or 'cf service data-flow' to check operation status.
```
As the command output suggests, you can use the `cf services` or `cf service` commands to check the status of the service instance. When the service instance is ready, the `cf service` command reports a status of `create succeeded`:
```
$ cf service data-flow

Service instance: data-flow
Service: p-dataflow
Bound apps:
Tags:
Plan: standard
Description: Deploys Spring Cloud Data Flow servers to orchestrate data pipelines
Documentation url: http://cloud.spring.io/spring-cloud-dataflow/
Dashboard: https://p-dataflow.apps.wise.com/instances/f09e5c77-e526-4f49-86d6-721c6b8e2fd9/dashboard

Last Operation
Status: create succeeded
Message: Created
Started: 2017-07-20T18:24:14Z
Updated: 2017-07-20T18:26:17Z
```
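If you script provisioning, you can poll `cf service` until the operation completes. The following is a minimal sketch, not a cf CLI feature: the helper name and polling interval are our own, and the pattern matches the `Status:` line shown in the output above.

```shell
# Poll `cf service NAME` until the create operation finishes.
# wait_for_service is an illustrative helper, not part of the cf CLI.
wait_for_service() {
  local name="$1"
  local status
  while true; do
    # Extract the two words after "Status:" (e.g. "create succeeded").
    status="$(cf service "$name" | awk '/^Status:/ {print $2, $3}')"
    case "$status" in
      "create succeeded") echo "ready";  return 0 ;;
      "create failed")    echo "failed"; return 1 ;;
      *)                  sleep 5 ;;   # still in progress; poll again
    esac
  done
}

# Usage: wait_for_service data-flow
```

Provisioning a Data Flow instance also provisions its dependent services, so the operation can take several minutes; a loop like this is friendlier than watching `cf services` by hand.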
## Deleting an Instance
Deleting a Data Flow service instance also deletes all of its dependent service instances.
Begin by targeting the correct org and space.
```
$ cf target -o myorg -s development
api endpoint:   https://api.system.wise.com
api version:    2.75.0
user:           user
org:            myorg
space:          development
```
You can view all service instances in the space using `cf services`.
```
$ cf services
Getting services in org myorg / space development as user...
OK

name                                            service               plan       bound apps   last operation
data-flow                                       p-dataflow            standard                create succeeded
mysql-b3e76c87-c5ae-47e4-a83c-5fabf2fc4f11      p-dataflow-mysql      proxy                   create succeeded
rabbitmq-b3e76c87-c5ae-47e4-a83c-5fabf2fc4f11   p-dataflow-rabbitmq   proxy                   create succeeded
redis-b3e76c87-c5ae-47e4-a83c-5fabf2fc4f11      p-dataflow-redis      proxy                   create succeeded
```
Delete the Data Flow service instance using `cf delete-service`. When prompted, enter `y` to confirm the deletion.
```
$ cf delete-service data-flow

Really delete the service data-flow?> y
Deleting service data-flow in org myorg / space development as user...
OK

Delete in progress. Use 'cf services' or 'cf service data-flow' to check operation status.
```
The dependent service instances for the Data Flow server service instance are deleted first, and then the Data Flow server service instance itself is deleted.
As the output from the `cf delete-service` command suggests, you can use the `cf services` or `cf service` commands to check the status of the operation. When the Data Flow service instance and its dependent service instances have been deleted, the `cf services` command no longer lists the service instance:
```
$ cf services
Getting services in org bklein / space dev as bklein...
OK

No services found
```
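For scripted teardown you can poll `cf services` the same way. The following sketch is our own helper, not a cf CLI feature; it assumes the instance name appears at the start of its row, as in the listing above.

```shell
# Block until `cf services` no longer lists the instance.
# wait_for_deletion is an illustrative helper, not part of the cf CLI.
wait_for_deletion() {
  local name="$1"
  # Match the instance name at the start of a line in the listing.
  while cf services | grep -q "^${name} "; do
    sleep 5   # delete still in progress; poll again
  done
  echo "deleted"
}

# Usage: wait_for_deletion data-flow
```

Because the broker deletes the dependent `p-dataflow-*` proxy instances before the Data Flow instance itself, waiting for the Data Flow instance to disappear is sufficient.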