PySpark Docker image to run on Kubernetes pods

I’ve been looking into how to build a custom PySpark Docker image to run on our company’s Kubernetes cluster (our existing Spark jobs are written in Scala). I haven’t found any examples that actually work. Does anyone think they can help?
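For reference, this is roughly the kind of image I’m aiming for. It’s a minimal sketch, not something I’ve gotten working end to end: it assumes the official `apache/spark-py` base image from Docker Hub (the `v3.4.0` tag and the `pandas`/`pyarrow` packages are just examples), and `USER 185` is the non-root Spark UID those images use.

```dockerfile
# Start from the official PySpark image published by the Spark project
# (tag is an example; pick one matching your cluster's Spark version).
FROM apache/spark-py:v3.4.0

# Switch to root temporarily so pip can install into the image.
USER root

# Example Python dependencies our jobs would need (placeholders).
RUN pip install --no-cache-dir pandas pyarrow

# Drop back to the non-root Spark user the base image expects.
USER 185
```

My understanding is that you would then push this to a registry and point `spark-submit --master k8s://...` at it via `spark.kubernetes.container.image`, but I haven’t confirmed that part works either.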