Not able to access YARN logs from the browser when running on a Docker container cluster

I have a Spark/Hadoop cluster running in Docker containers. When I submit a Spark job through YARN and try to view the logs in a browser, I can't, because the worker container's hostname is not resolvable from the browser. I have to manually edit the URL to point at the host IP instead of the worker container's hostname before the logs load. Is there any way to map the YARN log links to the host server's IP so that I can open them directly from the browser?
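For example (the container name, IP, and container ID below are just placeholders for my setup), the log link in the YARN UI points at

    http://spark-worker-1:8042/node/containerlogs/container_1234567890123_0001_01_000001/root

and I have to rewrite it by hand to

    http://192.168.1.10:8042/node/containerlogs/container_1234567890123_0001_01_000001/root

before it loads.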

I tried setting the worker's hostname to the server's hostname so that I could reach the logs through the server hostname, but then spark-submit stopped working because it could not find the worker on the host server. I also tried assigning yarn.nodemanager.webapp.https.address to the host server's IP, but again spark-submit itself stopped working.
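Roughly, the change I tried in yarn-site.xml looked like this (the IP is a placeholder for my host server's address; 8044 is the default NodeManager HTTPS web port):

    <property>
      <name>yarn.nodemanager.webapp.https.address</name>
      <!-- host server IP instead of the container hostname -->
      <value>192.168.1.10:8044</value>
    </property>

With that in place, spark-submit itself stopped working, so I reverted it.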

Hi. What is the zero at the beginning of your message?

Regarding your question, I'm afraid zero is how much I could help. You need someone who uses Spark and runs it in Docker containers. I ran Spark on Kubernetes, but that was more than a year ago, and I don't understand your question. You could try to find a better place to ask it here:

https://spark.apache.org/community.html

Of course you should also share what image you are using and how you run it. A simple description is not enough to understand the issue.

The 0 was a typo and I deleted it… It's not just Spark; it is spark-submit with YARN as the resource manager.

spark-submit is part of the Spark project.

I worked a little with YARN too, but that was even earlier. Maybe someone will come along and help you here, but a Spark forum seems like a much better place to ask your question.

Your question is more about how Spark works than about how Docker works. If I can help with anything related to Docker itself, rather than how Spark runs in Docker containers, I'm happy to. And of course, if an idea about your question comes to me, I will still share it.