
Is there way to test my Pyspark notebooks on Databricks

Posted on 2020-11-25 13:43:03

I'm developing on Databricks and would like to test the utility functions defined inside my notebooks. What's the best way to do this? Thank you in advance for your assistance 👊🏿

Questioner
Johan Khanye
Viewed
0
CHEEKATLAPRADEEP-MSFT 2020-12-02 17:04:01

You can use databricks-connect: Databricks Connect allows you to connect your favorite IDE (IntelliJ, Eclipse, PyCharm, RStudio, Visual Studio), notebook server (Zeppelin, Jupyter), and other custom applications to Databricks clusters and run Apache Spark code.
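As a rough sketch of the setup described above (the cluster details you enter during `configure` are your own workspace's values, not anything from this post), the legacy databricks-connect client is installed and verified from the command line:

```shell
# Install the client locally (version should match the cluster's
# Databricks Runtime version).
pip install databricks-connect

# Interactive prompt for workspace URL, token, cluster ID, etc.
databricks-connect configure

# Verify that the local client can reach the cluster and run Spark jobs.
databricks-connect test
```

After this, `SparkSession.builder.getOrCreate()` in your local IDE or test runner executes against the remote Databricks cluster.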

With databricks-connect you can connect your favorite IDE to your Databricks cluster. This means that you can now link, test, and package the code that you want to run on Databricks more easily.
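A complementary pattern, regardless of databricks-connect, is to keep notebook utilities as plain Python functions so they can be unit-tested without a cluster at all. A minimal sketch, where `normalize_columns` is a hypothetical utility (not from the question), testable with pytest locally:

```python
import re

def normalize_columns(columns):
    """Hypothetical utility: lower-case column names and replace
    non-alphanumeric runs with underscores. Operates on plain lists,
    so no SparkSession is needed to test it. In a notebook it would
    be applied with df.toDF(*normalize_columns(df.columns))."""
    return [re.sub(r"[^0-9a-zA-Z]+", "_", c).strip("_").lower() for c in columns]

# A pytest-style unit test -- runnable locally or via databricks-connect.
def test_normalize_columns():
    assert normalize_columns(["First Name", "e-mail"]) == ["first_name", "e_mail"]

test_normalize_columns()
```

Logic that genuinely needs a DataFrame can still be tested this way by building small input frames in the test, but extracting the pure-Python parts keeps most tests fast and cluster-free.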

For more details, refer to Test Code in Databricks Notebooks.