
Avoid one ShinyApp occupying all Spark resources #7

@LIN-Yu-Ting

Description


The current ShinyApp implementation shares a single SparkContext (`sc`) for the lifetime of the app, since `sc` is only destroyed when `sc.stop()` is called at shutdown. One app therefore holds its Spark resources the whole time it runs, which is poor resource utilization.

Proposed solution:

The lifecycle of `sc` should span only a single action. Create `sc` only when Spark is actually needed, and once the Spark operation has finished, terminate the SparkContext immediately with `spark_disconnect(sc)`.
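A minimal sketch of the proposed pattern, assuming a sparklyr-based Shiny server (the `input$run` trigger, the `master = "yarn"` setting, and the `mtcars` data are hypothetical placeholders, not taken from the actual app):

```r
library(shiny)
library(sparklyr)
library(dplyr)

server <- function(input, output, session) {
  output$result <- renderTable({
    req(input$run)                        # run only when the user triggers the action
    sc <- spark_connect(master = "yarn")  # acquire Spark resources just-in-time
    on.exit(spark_disconnect(sc))         # always release them, even on error
    sdf <- copy_to(sc, mtcars, overwrite = TRUE)
    sdf %>% count() %>% collect()         # collect the result locally before disconnecting
  })
}
```

Wrapping `spark_disconnect(sc)` in `on.exit()` guarantees the context is torn down even if the Spark operation fails, so no single session can hold cluster resources indefinitely.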

Metadata


Labels: enhancement (New feature or request)
