Implement integration with hdfs #26

Open
@adwk67

Description

As a user of the spark-k8s-operator, I want to be able to upload job dependencies to an HDFS path.

  • include the namenode in the CRD, making this conditional
  • include the necessary jars in the job image, so that they are available to each executor (see the sketch below)
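
For illustration, a minimal sketch of how the conditional namenode setting might look in a SparkApplication manifest. The `hdfs` block and its field names are assumptions for discussion, not the operator's confirmed schema:

```yaml
# Hypothetical SparkApplication excerpt. The `hdfs` block and field
# names are illustrative assumptions; the actual CRD schema may differ.
apiVersion: spark.stackable.tech/v1alpha1
kind: SparkApplication
metadata:
  name: spark-pi-hdfs
spec:
  mainApplicationFile: hdfs://namenode:8020/jobs/spark-pi.jar
  # Conditional: only present when the job reads from or writes to HDFS.
  # The operator could translate this into
  # `spark.hadoop.fs.defaultFS=hdfs://namenode:8020` for driver and executors.
  hdfs:
    namenode: hdfs://namenode:8020
```

For the second point, the Hadoop client jars would have to be baked into the job image itself (for example, copied into `$SPARK_HOME/jars` at image build time), since every executor pod resolves `hdfs://` URIs locally.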
