Connect to an on-premises Oracle database from a Docker container

Hello Team,

Is there any document that explains how we can connect to an on-premises Oracle database from a Docker container?

From the connection perspective, it doesn’t matter if the application that connects to the database is running as a native process on the host kernel, or an isolated containerized process on the host kernel.

That said, can you be more specific about what you're looking for exactly? Also, is it about an existing publicly available image, or a custom image you are looking to build yourself?

Hello @meyay

I am using the following two images from Docker Hub:

  1. AWS Glue image
  2. PostgreSQL

I am able to start containers from both images and was able to set up communication between the two containers using a bridge network. Now, as a next step, I either need to create another container for Oracle and include it in the same network,
or I can connect to the on-premises Oracle database.
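
For context, my current two-container bridge setup looks roughly like this (image names, tags, and container names are simplified placeholders, not the exact ones I used):

# create a user-defined bridge network and attach both containers to it
docker network create etl-net
docker run -d --name glue --network etl-net <aws-glue-image>
docker run -d --name postgres --network etl-net -e POSTGRES_PASSWORD=secret postgres:15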

For the first option, while building the Oracle image I am getting this error:
https://yum.oracle.com/repo/OracleLinux/OL7/latest/x86_64/repodata/ab949214e7975e1774d0b9b84e915f974bf0a135-primary.sqlite.bz2: [Errno 14] HTTPS Error 403 - Forbidden

So I thought of connecting to the on-premises Oracle database instead of building the image. Now I am unable to understand how the Docker containers I have created, which are running on network A, will connect to my on-premises Oracle database.

Hope this will help you.

Nope, that didn't really help to clarify the objective.

I guess you are trying to create an AWS Glue image, which you later want to use for AWS Glue ETL jobs. After it does its ETL transformation, you are looking to store the data in an Oracle database?

I also assume that the core of your question is: how do I pass the connection string into the container, so that you don't have to hard-code it.

If my assumptions are correct, you can use environment variables to inject the information into the container and read them from the environment inside your Python code, with something like this:

import os

# read the connection details that were injected via `docker run -e ...`
connection = os.environ['ORA_CONN']
username = os.environ['ORA_USER']
password = os.environ['ORA_PASS']

And create your container like this:

docker run -e ORA_CONN=however-the-string-must-look-like -e ORA_USER=username -e ORA_PASS=password ... <image-name> ...

Of course you will need to use the connection string, username and password that will provide access to your database. When using the image for the AWS Glue ETL job, you will need to find out how to pass environment variables to the ETL job.
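
To illustrate how those variables could then be used to open the connection, here is a minimal sketch only; I am assuming the python-oracledb driver and an example Easy Connect string here, your code may well use a different Oracle client:

import os
import oracledb  # assumption: the python-oracledb driver is installed in the image

# build the connection from the environment variables injected via docker run -e
conn = oracledb.connect(
    user=os.environ['ORA_USER'],
    password=os.environ['ORA_PASS'],
    dsn=os.environ['ORA_CONN'],  # e.g. dbhost.example.com:1521/ORCLPDB1 (example only)
)
print(conn.version)  # quick sanity check that the connection works
conn.close()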

After looking into the Glue documentation, I can see that an ETL job is supposed to write the data into an AWS Glue Data Catalog. Are you sure that Oracle can be used as a Data Catalog?

I can only guess, but I think the use of “on-premise” is what causes the misunderstanding. Since we are talking about containers vs. on-premises, and not cloud vs. on-premises, I guess the real goal is to connect to the Oracle database running on the host machine itself, not in a container.

If I am right and this is the case, I would search for “localhost” on the forum (top right corner) because this is a very frequent question. This is one of the relevant topics:

Hello @meyay

As @rimelek has mentioned, I am trying to connect to the Oracle database running on the host machine, not in a container.

And regarding the Data Catalog, I will not use that feature.

I will look into the solution @rimelek has mentioned.

I would like to thank both of you for taking the time to help solve my problem.

I must admit the big picture was ambiguous for me :smiley:

It was confusing that the question was about an on-prem Oracle database, but then AWS Glue was introduced, which made it appear as if the whole setup was ultimately supposed to run at AWS.
On-prem does not really imply that the database would be running locally on the Docker host itself. Choice of words really makes a difference, as it helps to prevent assumptions in the wrong direction.

I hope you find your solution :slight_smile:

@purnima1612, since you show getting a 403 status code, that would suggest something about the host (the OS or the DB) rejecting the request for security reasons. While it could be about the username or password, I'd suspect it's about the IP address.

For instance, it's possible that Oracle has been configured to limit which IPs can reach the DB. And it's easy to fall into thinking that the Docker container is "on the same machine" as the DB, but the container gets its own IP and so will SEEM to be a different machine. And while you may be able to get the IP of the container and add it to Oracle's allow-list, beware that each time you run the Docker images, their IPs can change.
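
If it helps, this is one way to check which IP a container actually got, so you can compare it against any restrictions on the Oracle side (the container name is a placeholder):

docker inspect -f '{{range .NetworkSettings.Networks}}{{.IPAddress}} {{end}}' <container-name>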

Something that may be useful also would be to look at the oracle logs, to see what it may be complaining about in these failures.

Another factor is whether or not you use the --network flag on docker run (or in the compose file). Have you tried it both ways? And have you tried it with --network=host?
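
For example, the two variants would look roughly like this (image name and options are placeholders; note that host networking behaves differently under Docker Desktop than on a native Linux engine):

# bridge network (the default): the container gets its own IP
docker run --rm <image-name> ...

# host network (native Linux engine): the container shares the host's network stack,
# so localhost inside the container is the host itself
docker run --rm --network=host <image-name> ...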

Finally, you don't show what values you may be trying as the host IP. As discussed in that other post, a common solution in this situation is to use host.docker.internal rather than localhost for reaching the host from within the container. Whether that works may depend on the matters above.
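
As a sketch of that approach (host.docker.internal is built in on Docker Desktop; on a native Linux engine, Docker 20.10+ lets you map it with --add-host; the port and service name below are only examples):

# Docker Desktop: host.docker.internal resolves to the host out of the box
docker run -e ORA_CONN=host.docker.internal:1521/ORCLPDB1 <image-name> ...

# native Linux engine (20.10+): map the name to the host gateway explicitly
docker run --add-host=host.docker.internal:host-gateway -e ORA_CONN=host.docker.internal:1521/ORCLPDB1 <image-name> ...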

Also, it may matter what OS you're using, and whether you are using Docker Desktop, as it enables some things that native Docker installs do not.

Let us know if any of these help you move the ball down the field.