4.2 Spark API
Delete an aggregation job's output
DELETE /spark/jobs/{collection}/{id}/output
Path Parameters

collection (string, required): Collection name
id (string, required): Aggregation ID

Query Parameters

jobId (string, required): Job ID

Response

default: successful operation
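
Example

As a rough sketch, the call below issues this DELETE request with Python's requests library. The base URL, port, credentials, and the sample collection, aggregation ID, and job ID are all assumptions for illustration, not values from this reference; substitute the ones for your own Fusion deployment and aggregation job.

import requests

# Assumed values for a local Fusion install; replace with your own.
FUSION_API = "http://localhost:8764/api"    # assumed API base URL
AUTH = ("admin", "password123")             # assumed admin credentials

collection = "products"          # path parameter: Collection name
aggregation_id = "sku_signals"   # path parameter: Aggregation ID
job_id = "sku_signals_agg"       # query parameter: Job ID

# DELETE /spark/jobs/{collection}/{id}/output?jobId={jobId}
resp = requests.delete(
    f"{FUSION_API}/spark/jobs/{collection}/{aggregation_id}/output",
    params={"jobId": job_id},
    auth=AUTH,
)
resp.raise_for_status()  # raises on a non-2xx response
print(resp.status_code)

Because the response above is documented only as "default: successful operation" with no body schema, checking the HTTP status code is the most portable way to confirm the aggregation job's output was deleted.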