Jupyter Lab and PySpark on Compose

Hi,

I'm trying to run Jupyter Lab with a Spark service on Docker Compose, but my jobs fail with an "Initial job has not accepted any resources" error. Any help would be much appreciated. Full details are in my Stack Overflow question: "Docker compose - Jupyter notebook with Spark Cluster - Initial job has not accepted any resources".
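For context, here is a minimal sketch of the kind of setup I mean (service names, image tags, and resource values are illustrative, not my exact file). In a standalone Spark cluster, this error typically shows up when the application asks for more memory or cores than any worker advertises, or when the workers cannot reach the driver running inside the Jupyter container:

```yaml
version: "3.8"

services:
  spark-master:
    image: bitnami/spark:3.4          # illustrative image/tag
    environment:
      - SPARK_MODE=master
    ports:
      - "7077:7077"                   # master RPC port
      - "8080:8080"                   # master web UI

  spark-worker:
    image: bitnami/spark:3.4
    environment:
      - SPARK_MODE=worker
      - SPARK_MASTER_URL=spark://spark-master:7077
      # Worker must advertise at least as much memory/cores
      # as the application requests, or the job sits pending
      # with "Initial job has not accepted any resources".
      - SPARK_WORKER_MEMORY=2G
      - SPARK_WORKER_CORES=2
    depends_on:
      - spark-master

  jupyter:
    image: jupyter/pyspark-notebook   # illustrative image
    ports:
      - "8888:8888"
```

With a layout like this, the SparkSession in the notebook would point at `spark://spark-master:7077`, request no more than `SPARK_WORKER_MEMORY`/`SPARK_WORKER_CORES` per executor, and set `spark.driver.host` to the Jupyter service's hostname on the Compose network so executors can connect back to the driver.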